Apr 24 14:23:51.770922 ip-10-0-129-77 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 14:23:52.154796 ip-10-0-129-77 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:23:52.154796 ip-10-0-129-77 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 14:23:52.154796 ip-10-0-129-77 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:23:52.154796 ip-10-0-129-77 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 14:23:52.154796 ip-10-0-129-77 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:23:52.157578 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.157488 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 14:23:52.161885 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.161858 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:23:52.161885 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.161886 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:23:52.162012 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.161892 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:23:52.162012 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.161899 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:23:52.162012 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.161904 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:23:52.162012 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.161911 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:23:52.162012 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.161916 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:23:52.162012 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.161923 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:23:52.162012 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.161932 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:23:52.162012 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.161975 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162102 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162108 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162111 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162115 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162118 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162120 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162123 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162125 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162128 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162131 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162134 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162136 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162140 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162142 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162145 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162148 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162151 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162153 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162156 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:23:52.162208 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162158 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162161 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162164 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162170 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162175 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162178 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162180 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162184 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162186 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162189 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162192 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162194 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162197 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162200 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162203 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162206 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162209 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162212 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162215 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162217 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:23:52.162705 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162220 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162223 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162225 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162229 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162232 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162234 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162237 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162239 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162242 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162245 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162248 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162251 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162254 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162256 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162259 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162261 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162264 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162266 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162269 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162271 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:23:52.163188 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162275 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162278 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162281 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162283 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162285 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162288 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162293 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162295 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162298 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162301 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162304 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162307 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162309 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162312 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162314 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162317 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:23:52.163715 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.162319 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.163960 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.163968 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.163972 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.163975 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.163978 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.163981 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.163984 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.163987 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.163991 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.163994 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.163997 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.163999 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164002 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164005 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164009 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164012 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164015 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164018 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164021 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:23:52.164101 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164023 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164026 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164028 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164031 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164033 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164036 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164039 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164041 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164043 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164046 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164049 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164051 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164054 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164056 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164059 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164061 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164064 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164066 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164069 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:23:52.164570 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164071 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164074 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164076 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164078 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164081 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164083 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164086 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164088 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164091 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164093 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164096 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164098 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164101 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164104 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164106 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164109 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164112 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164114 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164117 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164120 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:23:52.165050 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164122 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164125 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164128 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164131 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164133 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164136 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164138 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164141 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164143 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164146 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164148 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164150 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164153 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164155 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164158 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164161 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164164 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164166 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164168 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164171 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:23:52.165541 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164173 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164176 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164178 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164181 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164183 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164185 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164188 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164190 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164256 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164263 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164270 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164274 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164279 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164282 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164287 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164291 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164294 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164297 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164301 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164318 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164322 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164325 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 24 14:23:52.166044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164329 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164332 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164335 2570 flags.go:64] FLAG: --cloud-config=""
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164338 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164341 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164345 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164348 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164351 2570 flags.go:64] FLAG: --config-dir=""
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164354 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164358 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164362 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164365 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164368 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164371 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164374 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164377 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164380 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164383 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164386 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164390 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164393 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164396 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164398 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164402 2570 flags.go:64] FLAG: --enable-server="true"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164405 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 14:23:52.166562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164410 2570 flags.go:64] FLAG: --event-burst="100"
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164413 2570 flags.go:64] FLAG: --event-qps="50"
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164415 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164418 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164422 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164425 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164429 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164431 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164434 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164437 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164441 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164444 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164447 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164450 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164452 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164456 2570 flags.go:64] FLAG: --feature-gates=""
Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164462 2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24
14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164467 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164471 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164474 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164477 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164480 2570 flags.go:64] FLAG: --help="false" Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164483 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164486 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 14:23:52.167174 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164489 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164492 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164495 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164498 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164501 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164504 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164506 2570 flags.go:64] 
FLAG: --kernel-memcg-notification="false" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164510 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164524 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164528 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164531 2570 flags.go:64] FLAG: --kube-reserved="" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164534 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164540 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164543 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164546 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164549 2570 flags.go:64] FLAG: --lock-file="" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164552 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164555 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164558 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164563 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164566 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 
14:23:52.164569 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164571 2570 flags.go:64] FLAG: --logging-format="text" Apr 24 14:23:52.167763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164574 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164578 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164580 2570 flags.go:64] FLAG: --manifest-url="" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164583 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164588 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164591 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164595 2570 flags.go:64] FLAG: --max-pods="110" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164598 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164601 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164604 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164607 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164610 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164613 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 
14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164616 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164638 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164641 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164645 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164648 2570 flags.go:64] FLAG: --pod-cidr="" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164651 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164658 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164661 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164665 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164668 2570 flags.go:64] FLAG: --port="10250" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164671 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 14:23:52.168310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164674 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09a42348fc29da64c" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164677 2570 flags.go:64] FLAG: --qos-reserved="" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164680 2570 flags.go:64] FLAG: 
--read-only-port="10255" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164683 2570 flags.go:64] FLAG: --register-node="true" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164686 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164688 2570 flags.go:64] FLAG: --register-with-taints="" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164692 2570 flags.go:64] FLAG: --registry-burst="10" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164695 2570 flags.go:64] FLAG: --registry-qps="5" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164697 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164700 2570 flags.go:64] FLAG: --reserved-memory="" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164704 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164707 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164710 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164712 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164715 2570 flags.go:64] FLAG: --runonce="false" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164718 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164721 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164724 2570 flags.go:64] FLAG: 
--seccomp-default="false" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164726 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164729 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164732 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164736 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164739 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164742 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164745 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164747 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 14:23:52.168929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164752 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164755 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164758 2570 flags.go:64] FLAG: --system-cgroups="" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164762 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164767 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164770 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 24 14:23:52.169547 ip-10-0-129-77 
kubenswrapper[2570]: I0424 14:23:52.164773 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164777 2570 flags.go:64] FLAG: --tls-min-version="" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164780 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164783 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164786 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164789 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164792 2570 flags.go:64] FLAG: --v="2" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164796 2570 flags.go:64] FLAG: --version="false" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164800 2570 flags.go:64] FLAG: --vmodule="" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164804 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.164807 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164912 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164917 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164920 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164923 2570 
feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164925 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164928 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 14:23:52.169547 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164930 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164932 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164935 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164937 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164940 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164943 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164947 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164950 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164953 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164956 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164959 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164962 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164965 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164968 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164970 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164973 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164975 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164978 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164980 2570 feature_gate.go:328] 
unrecognized feature gate: Example2 Apr 24 14:23:52.170106 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164983 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164985 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164987 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164990 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164992 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164994 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164997 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.164999 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165002 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165004 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165006 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165009 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165011 2570 feature_gate.go:328] unrecognized 
feature gate: MachineConfigNodes Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165014 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165016 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165018 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165021 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165023 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165026 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165028 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165031 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 14:23:52.170571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165033 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165035 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165038 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165040 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165044 2570 
feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165047 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165049 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165052 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165055 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165058 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165062 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165065 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165068 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165071 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165073 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165077 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165079 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165082 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165084 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 14:23:52.171133 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165087 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165089 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165092 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165094 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 
14:23:52.165096 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165099 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165101 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165103 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165107 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165110 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165112 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165115 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165117 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165120 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165123 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165125 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165128 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165132 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165134 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165137 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:23:52.171606 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.165139 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:23:52.172138 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.165823 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:23:52.172661 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.172642 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 14:23:52.172696 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.172662 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 14:23:52.172726 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172712 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:23:52.172726 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172717 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:23:52.172726 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172721 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:23:52.172726 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172724 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:23:52.172726 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172727 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172731 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172734 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172737 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172741 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172746 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172749 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172752 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172755 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172759 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172761 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172764 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172766 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172769 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172771 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172774 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172777 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172779 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172782 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:23:52.172854 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172784 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172787 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172790 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172793 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172795 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172798 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172800 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172803 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172806 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172808 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172811 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172813 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172816 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172818 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172821 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172824 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172827 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172830 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172833 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172835 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:23:52.173401 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172838 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172840 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172843 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172846 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172848 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172851 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172853 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172856 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172858 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172861 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172864 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172867 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172870 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172872 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172875 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172879 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172881 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172884 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172886 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172888 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:23:52.173914 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172891 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172894 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172896 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172899 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172901 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172904 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172906 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172910 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172912 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172915 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172917 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172920 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172922 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172926 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172930 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172933 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172935 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172938 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172940 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:23:52.174385 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172943 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172946 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172948 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.172951 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.172956 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173051 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173055 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173058 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173061 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173065 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173067 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173070 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173073 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173076 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173078 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173084 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:23:52.174858 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173086 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173089 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173091 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173094 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173097 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173100 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173102 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173105 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173107 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173110 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173112 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173115 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173117 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173119 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173123 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173126 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173129 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173132 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173134 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173137 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:23:52.175232 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173140 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173143 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173145 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173148 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173150 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173153 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173156 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173158 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173161 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173163 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173165 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173168 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173170 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173173 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173175 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173177 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173181 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173184 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173186 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:23:52.175722 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173189 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173191 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173193 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173196 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173198 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173201 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173203 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173205 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173208 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173210 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173212 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173214 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173217 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173219 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173222 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173224 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173226 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173228 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173231 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173233 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:23:52.176182 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173236 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:23:52.176676 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173238 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:23:52.176676 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173241 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:23:52.176676 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173243 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:23:52.176676 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173245 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:23:52.176676 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173248 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:23:52.176676 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173250 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:23:52.176676 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173252 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:23:52.176676 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173255 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:23:52.176676 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173257 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:23:52.176676 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173260 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:23:52.176676 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173263 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:23:52.176676 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173266 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:23:52.176676 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173269 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:23:52.176676 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173272 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:23:52.176676 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:52.173274 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:23:52.177034 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.173279 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:23:52.177034 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.173947 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 14:23:52.177765 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.177750 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 14:23:52.178666 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.178654 2570 server.go:1019] "Starting client certificate rotation"
Apr 24 14:23:52.178787 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.178769 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 14:23:52.178837 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.178822 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 14:23:52.200379 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.200357 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 14:23:52.203046 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.203022 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 14:23:52.223309 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.223288 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 24 14:23:52.228150 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.228132 2570 log.go:25] "Validated CRI v1 image API"
Apr 24 14:23:52.229356 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.229331 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 14:23:52.231704 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.231684 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 14:23:52.232742 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.232724 2570 fs.go:135] Filesystem UUIDs: map[0161a74a-cf60-45d9-b8af-feb4c7108e05:/dev/nvme0n1p3 51abc2ff-f733-4bd4-b679-dd7f8cd24b6e:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 24 14:23:52.232799 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.232743 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 14:23:52.238311 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.238200 2570 manager.go:217] Machine: {Timestamp:2026-04-24 14:23:52.236688031 +0000 UTC m=+0.362563728 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100797 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c255d9839365e8652da22697ed5ea SystemUUID:ec2c255d-9839-365e-8652-da22697ed5ea BootID:db204698-509d-4a7f-8664-4d531ea3507f Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:43:ec:d4:bf:e7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:43:ec:d4:bf:e7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:4f:6e:90:d8:38 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 14:23:52.238311 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.238306 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 14:23:52.238428 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.238417 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 14:23:52.239473 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.239448 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 14:23:52.239616 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.239476 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-77.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 14:23:52.239676 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.239639 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 14:23:52.239676 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.239649 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 14:23:52.239676 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.239661 2570
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 14:23:52.240372 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.240362 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 14:23:52.241586 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.241575 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 24 14:23:52.241708 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.241699 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 14:23:52.243818 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.243808 2570 kubelet.go:491] "Attempting to sync node with API server" Apr 24 14:23:52.243850 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.243829 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 14:23:52.243850 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.243841 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 14:23:52.243850 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.243850 2570 kubelet.go:397] "Adding apiserver pod source" Apr 24 14:23:52.243941 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.243864 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 14:23:52.245008 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.244995 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 14:23:52.245044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.245021 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 14:23:52.247843 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.247824 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 14:23:52.249889 ip-10-0-129-77 
kubenswrapper[2570]: I0424 14:23:52.249875 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 14:23:52.249959 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.249896 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4p244" Apr 24 14:23:52.251123 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.251106 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 14:23:52.251218 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.251128 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 14:23:52.251218 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.251137 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 14:23:52.251218 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.251144 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 14:23:52.251218 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.251152 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 14:23:52.251218 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.251159 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 14:23:52.251218 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.251165 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 14:23:52.251218 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.251171 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 14:23:52.251218 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.251178 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 14:23:52.251218 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.251184 2570 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/configmap" Apr 24 14:23:52.251218 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.251204 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 14:23:52.251218 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.251212 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 14:23:52.251871 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.251861 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 14:23:52.251871 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.251871 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 14:23:52.255463 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.255448 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 14:23:52.255537 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.255485 2570 server.go:1295] "Started kubelet" Apr 24 14:23:52.255647 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.255598 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 14:23:52.255729 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.255654 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 14:23:52.255768 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.255751 2570 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 14:23:52.256458 ip-10-0-129-77 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 14:23:52.260059 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.260038 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 14:23:52.260348 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.260319 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-77.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 14:23:52.260572 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.260552 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-77.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 14:23:52.260753 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.260723 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 14:23:52.261121 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.261094 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4p244" Apr 24 14:23:52.262073 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.262046 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 24 14:23:52.267220 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.267200 2570 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 14:23:52.267554 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.267525 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 14:23:52.267956 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.267944 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 14:23:52.268662 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.268640 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 14:23:52.268662 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.268643 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 14:23:52.268807 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.268673 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 14:23:52.268807 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.268752 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 24 14:23:52.268807 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.268760 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 24 14:23:52.268807 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.268768 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-77.ec2.internal\" not found" Apr 24 14:23:52.268988 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.268833 2570 factory.go:55] Registering systemd factory Apr 24 14:23:52.268988 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.268852 2570 factory.go:223] Registration of the systemd container factory successfully Apr 24 14:23:52.269073 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.269042 2570 factory.go:153] Registering CRI-O factory Apr 24 14:23:52.269073 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.269052 2570 factory.go:223] Registration of the crio container factory successfully Apr 24 
14:23:52.269156 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.269092 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 14:23:52.269156 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.269119 2570 factory.go:103] Registering Raw factory Apr 24 14:23:52.269237 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.269170 2570 manager.go:1196] Started watching for new ooms in manager Apr 24 14:23:52.269755 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.269740 2570 manager.go:319] Starting recovery of all containers Apr 24 14:23:52.270834 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.270815 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:23:52.273043 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.273022 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-77.ec2.internal\" not found" node="ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.276225 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.276189 2570 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 24 14:23:52.281161 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.281146 2570 manager.go:324] Recovery completed Apr 24 14:23:52.285292 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.285279 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:23:52.288606 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.288591 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:23:52.288710 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.288640 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:23:52.288710 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.288653 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:23:52.289123 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.289099 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 14:23:52.289123 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.289110 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 14:23:52.289123 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.289127 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 24 14:23:52.291587 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.291575 2570 policy_none.go:49] "None policy: Start" Apr 24 14:23:52.291655 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.291590 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 14:23:52.291655 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.291600 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 24 14:23:52.334232 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.334216 2570 manager.go:341] "Starting Device Plugin manager" Apr 24 14:23:52.375353 ip-10-0-129-77 
kubenswrapper[2570]: E0424 14:23:52.334246 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 14:23:52.375353 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.334258 2570 server.go:85] "Starting device plugin registration server" Apr 24 14:23:52.375353 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.334520 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 14:23:52.375353 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.334534 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 14:23:52.375353 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.334661 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 14:23:52.375353 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.334740 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 14:23:52.375353 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.334749 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 14:23:52.375353 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.335343 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 14:23:52.375353 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.335376 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-77.ec2.internal\" not found" Apr 24 14:23:52.412928 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.412874 2570 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 24 14:23:52.412928 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.412907 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 14:23:52.412928 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.412926 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 14:23:52.413096 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.412932 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 14:23:52.413096 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.413003 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 14:23:52.416185 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.416167 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:23:52.435101 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.435082 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:23:52.435933 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.435918 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:23:52.435998 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.435950 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:23:52.435998 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.435962 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:23:52.435998 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.435992 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.442962 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.442942 2570 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.443051 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.442967 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-77.ec2.internal\": node \"ip-10-0-129-77.ec2.internal\" not found" Apr 24 14:23:52.463977 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.463955 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-77.ec2.internal\" not found" Apr 24 14:23:52.513949 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.513919 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-77.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal"] Apr 24 14:23:52.514036 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.514002 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:23:52.514863 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.514847 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:23:52.514950 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.514875 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:23:52.514950 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.514885 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:23:52.516086 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.516074 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:23:52.516270 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.516256 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.516316 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.516282 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:23:52.516802 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.516784 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:23:52.516889 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.516818 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:23:52.516889 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.516836 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:23:52.516889 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.516792 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:23:52.516889 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.516889 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:23:52.517070 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.516902 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:23:52.518154 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.518137 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.518231 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.518169 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:23:52.518813 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.518798 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:23:52.518898 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.518818 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:23:52.518898 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.518829 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:23:52.543150 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.543124 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-77.ec2.internal\" not found" node="ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.547514 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.547498 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-77.ec2.internal\" not found" node="ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.564471 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.564453 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-77.ec2.internal\" not found" Apr 24 14:23:52.570389 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.570369 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/935c2debf0f5dd0986c38f8610254d0b-config\") pod 
\"kube-apiserver-proxy-ip-10-0-129-77.ec2.internal\" (UID: \"935c2debf0f5dd0986c38f8610254d0b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.570447 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.570399 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/864e358909ff6c4ae03cf379c8abd2b5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal\" (UID: \"864e358909ff6c4ae03cf379c8abd2b5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.570447 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.570417 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/864e358909ff6c4ae03cf379c8abd2b5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal\" (UID: \"864e358909ff6c4ae03cf379c8abd2b5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.665153 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.665081 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-77.ec2.internal\" not found" Apr 24 14:23:52.671417 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.671398 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/864e358909ff6c4ae03cf379c8abd2b5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal\" (UID: \"864e358909ff6c4ae03cf379c8abd2b5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.671482 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.671426 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/935c2debf0f5dd0986c38f8610254d0b-config\") pod \"kube-apiserver-proxy-ip-10-0-129-77.ec2.internal\" (UID: \"935c2debf0f5dd0986c38f8610254d0b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.671482 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.671472 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/864e358909ff6c4ae03cf379c8abd2b5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal\" (UID: \"864e358909ff6c4ae03cf379c8abd2b5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.671561 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.671526 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/935c2debf0f5dd0986c38f8610254d0b-config\") pod \"kube-apiserver-proxy-ip-10-0-129-77.ec2.internal\" (UID: \"935c2debf0f5dd0986c38f8610254d0b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.671561 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.671493 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/864e358909ff6c4ae03cf379c8abd2b5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal\" (UID: \"864e358909ff6c4ae03cf379c8abd2b5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.671561 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.671526 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/864e358909ff6c4ae03cf379c8abd2b5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal\" (UID: \"864e358909ff6c4ae03cf379c8abd2b5\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.765898 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.765852 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-77.ec2.internal\" not found" Apr 24 14:23:52.845404 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.845368 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.849832 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:52.849815 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal" Apr 24 14:23:52.866915 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.866879 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-77.ec2.internal\" not found" Apr 24 14:23:52.967416 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:52.967327 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-77.ec2.internal\" not found" Apr 24 14:23:53.067886 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:53.067849 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-77.ec2.internal\" not found" Apr 24 14:23:53.168395 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:53.168353 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-77.ec2.internal\" not found" Apr 24 14:23:53.178778 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.178749 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 14:23:53.178951 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.178932 2570 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 14:23:53.179010 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.178952 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 14:23:53.256787 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.256539 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:23:53.260712 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.260690 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:23:53.263927 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.263901 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 14:18:52 +0000 UTC" deadline="2028-01-14 07:31:37.629950566 +0000 UTC" Apr 24 14:23:53.263927 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.263926 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15113h7m44.366026985s" Apr 24 14:23:53.267812 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.267789 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 14:23:53.268946 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.268929 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-77.ec2.internal" Apr 24 14:23:53.276739 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.276722 2570 
warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 14:23:53.277418 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.277407 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal" Apr 24 14:23:53.284512 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.284494 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 14:23:53.287404 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.287391 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 14:23:53.299907 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.299888 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-77xch" Apr 24 14:23:53.308042 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.308026 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-77xch" Apr 24 14:23:53.445243 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:53.445210 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod864e358909ff6c4ae03cf379c8abd2b5.slice/crio-3a839d14956beae01fd12298dd7fdc5aaecf9c80b14a0ef0dbe5460f8d448339 WatchSource:0}: Error finding container 3a839d14956beae01fd12298dd7fdc5aaecf9c80b14a0ef0dbe5460f8d448339: Status 404 returned error can't find the container with id 3a839d14956beae01fd12298dd7fdc5aaecf9c80b14a0ef0dbe5460f8d448339 Apr 24 14:23:53.448816 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:53.448798 2570 provider.go:93] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:23:53.568420 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:53.568389 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod935c2debf0f5dd0986c38f8610254d0b.slice/crio-36b6509f8cb040cfacda8c1d02be64052dc74b375babf5c9750f7e15db8a738e WatchSource:0}: Error finding container 36b6509f8cb040cfacda8c1d02be64052dc74b375babf5c9750f7e15db8a738e: Status 404 returned error can't find the container with id 36b6509f8cb040cfacda8c1d02be64052dc74b375babf5c9750f7e15db8a738e Apr 24 14:23:54.245590 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.245562 2570 apiserver.go:52] "Watching apiserver" Apr 24 14:23:54.254819 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.254798 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 14:23:54.257105 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.256936 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-v5zt2","openshift-cluster-node-tuning-operator/tuned-52ssx","openshift-dns/node-resolver-7l5sm","openshift-image-registry/node-ca-629tj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal","openshift-ovn-kubernetes/ovnkube-node-pbjt7","kube-system/konnectivity-agent-zrhjw","kube-system/kube-apiserver-proxy-ip-10-0-129-77.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z","openshift-multus/multus-additional-cni-plugins-96scn","openshift-multus/multus-qhq2p","openshift-multus/network-metrics-daemon-f5bf4","openshift-network-diagnostics/network-check-target-fkvkp"] Apr 24 14:23:54.258522 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.258497 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-zrhjw" Apr 24 14:23:54.261153 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.260904 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.261810 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.261435 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wmrp2\"" Apr 24 14:23:54.261810 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.261544 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 14:23:54.261810 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.261685 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 14:23:54.262149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.262130 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7l5sm" Apr 24 14:23:54.262225 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.262211 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-629tj" Apr 24 14:23:54.264204 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.263934 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:23:54.264204 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.264048 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mbtvc\"" Apr 24 14:23:54.264324 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.264262 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 14:23:54.265157 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.265101 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.266613 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.265999 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 14:23:54.266613 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.266197 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 14:23:54.266613 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.266370 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 14:23:54.266613 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.266526 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-hr775\"" Apr 24 14:23:54.266613 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.266576 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 14:23:54.267041 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.266661 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 14:23:54.267041 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.266732 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-szvkl\"" Apr 24 14:23:54.267750 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.267412 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-v5zt2" Apr 24 14:23:54.267750 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.267462 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" Apr 24 14:23:54.267750 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.267547 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-96scn" Apr 24 14:23:54.269410 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.269388 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 14:23:54.269723 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.269704 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 14:23:54.272879 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.272859 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 14:23:54.273023 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.272910 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l5f74\"" Apr 24 14:23:54.273023 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.272985 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 14:23:54.273150 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.273133 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 14:23:54.273245 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.273167 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 14:23:54.273245 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.273219 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 14:23:54.274839 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.274821 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 14:23:54.275012 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.274996 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 14:23:54.275171 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.275157 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mjlgd\"" Apr 24 14:23:54.275321 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.275309 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 14:23:54.275706 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.275690 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 14:23:54.276713 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.276693 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 14:23:54.276929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.276910 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-66tkf\"" Apr 24 14:23:54.277001 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.276936 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:23:54.277055 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:54.277012 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139" Apr 24 14:23:54.278760 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.278740 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:23:54.278856 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:54.278799 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c" Apr 24 14:23:54.278921 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.278861 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.279208 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.279189 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 14:23:54.279319 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.279303 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kv8sp\"" Apr 24 14:23:54.279832 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.279520 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 14:23:54.279832 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.279640 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 14:23:54.279832 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.279748 2570 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 14:23:54.279832 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.279810 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:23:54.280369 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280348 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-run-netns\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.280444 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280383 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-var-lib-openvswitch\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.280444 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280408 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-run-openvswitch\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.280444 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280430 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-run-netns\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " 
pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.280605 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280521 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-socket-dir\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" Apr 24 14:23:54.280605 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280578 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03c62f6c-4b55-4a95-82a9-797e22c98930-tuning-conf-dir\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn" Apr 24 14:23:54.280747 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280608 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/77cbed1f-c18d-4a43-bcc3-230e23453a72-ovnkube-config\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.280747 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280662 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-multus-cni-dir\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.280747 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280688 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-var-lib-cni-multus\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.280747 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280723 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-etc-kubernetes\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.280747 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280744 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" Apr 24 14:23:54.280902 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280766 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0ebb20a0-f56e-442b-8216-34313a45e74c-agent-certs\") pod \"konnectivity-agent-zrhjw\" (UID: \"0ebb20a0-f56e-442b-8216-34313a45e74c\") " pod="kube-system/konnectivity-agent-zrhjw" Apr 24 14:23:54.280902 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280789 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-tuned\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.280902 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280812 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwntb\" (UniqueName: \"kubernetes.io/projected/9bc5ec31-bf4b-46de-abb2-21a96bd6160a-kube-api-access-xwntb\") pod \"node-ca-629tj\" (UID: \"9bc5ec31-bf4b-46de-abb2-21a96bd6160a\") " pod="openshift-image-registry/node-ca-629tj" Apr 24 14:23:54.280902 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280839 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03c62f6c-4b55-4a95-82a9-797e22c98930-os-release\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn" Apr 24 14:23:54.280902 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280866 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/03c62f6c-4b55-4a95-82a9-797e22c98930-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn" Apr 24 14:23:54.281077 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280887 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-systemd\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.281077 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280930 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-log-socket\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.281077 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280954 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-multus-socket-dir-parent\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.281077 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.280985 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-var-lib-cni-bin\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.281077 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281007 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-multus-daemon-config\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.281077 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281032 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctskh\" (UniqueName: \"kubernetes.io/projected/67732aa7-95a9-4a96-9e40-a2a525b77a52-kube-api-access-ctskh\") pod \"node-resolver-7l5sm\" (UID: \"67732aa7-95a9-4a96-9e40-a2a525b77a52\") " pod="openshift-dns/node-resolver-7l5sm" Apr 24 14:23:54.281077 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281065 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvc9d\" (UniqueName: 
\"kubernetes.io/projected/65b901d7-b766-4277-901d-fd586db8de46-kube-api-access-mvc9d\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" Apr 24 14:23:54.281314 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281099 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-var-lib-kubelet\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.281314 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281168 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/562db2e5-0cd5-452e-9208-ceb438b32453-iptables-alerter-script\") pod \"iptables-alerter-v5zt2\" (UID: \"562db2e5-0cd5-452e-9208-ceb438b32453\") " pod="openshift-network-operator/iptables-alerter-v5zt2" Apr 24 14:23:54.281314 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281197 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/67732aa7-95a9-4a96-9e40-a2a525b77a52-hosts-file\") pod \"node-resolver-7l5sm\" (UID: \"67732aa7-95a9-4a96-9e40-a2a525b77a52\") " pod="openshift-dns/node-resolver-7l5sm" Apr 24 14:23:54.281314 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281227 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-cni-netd\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.281314 ip-10-0-129-77 kubenswrapper[2570]: 
I0424 14:23:54.281252 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/77cbed1f-c18d-4a43-bcc3-230e23453a72-env-overrides\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.281314 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281276 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-os-release\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.281314 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281300 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-registration-dir\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" Apr 24 14:23:54.281538 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281324 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-host\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.281538 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281348 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ltcb\" (UniqueName: \"kubernetes.io/projected/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-kube-api-access-8ltcb\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " 
pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.281538 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281370 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9bc5ec31-bf4b-46de-abb2-21a96bd6160a-host\") pod \"node-ca-629tj\" (UID: \"9bc5ec31-bf4b-46de-abb2-21a96bd6160a\") " pod="openshift-image-registry/node-ca-629tj" Apr 24 14:23:54.281538 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281392 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-slash\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.281538 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281416 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-run-ovn\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.281538 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281439 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.281538 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281474 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlqth\" (UniqueName: 
\"kubernetes.io/projected/562db2e5-0cd5-452e-9208-ceb438b32453-kube-api-access-zlqth\") pod \"iptables-alerter-v5zt2\" (UID: \"562db2e5-0cd5-452e-9208-ceb438b32453\") " pod="openshift-network-operator/iptables-alerter-v5zt2"
Apr 24 14:23:54.281538 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281498 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03c62f6c-4b55-4a95-82a9-797e22c98930-system-cni-dir\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn"
Apr 24 14:23:54.281538 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281521 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds59v\" (UniqueName: \"kubernetes.io/projected/03c62f6c-4b55-4a95-82a9-797e22c98930-kube-api-access-ds59v\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281546 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-sysctl-d\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281567 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-node-log\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281589 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/562db2e5-0cd5-452e-9208-ceb438b32453-host-slash\") pod \"iptables-alerter-v5zt2\" (UID: \"562db2e5-0cd5-452e-9208-ceb438b32453\") " pod="openshift-network-operator/iptables-alerter-v5zt2"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281610 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-sys-fs\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281676 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281712 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281737 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03c62f6c-4b55-4a95-82a9-797e22c98930-cnibin\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281758 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-lib-modules\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281780 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-tmp\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281801 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-system-cni-dir\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281824 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/77cbed1f-c18d-4a43-bcc3-230e23453a72-ovn-node-metrics-cert\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281845 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/77cbed1f-c18d-4a43-bcc3-230e23453a72-ovnkube-script-lib\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281867 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-cni-binary-copy\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281888 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-sys\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281911 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-kubelet\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281934 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-etc-openvswitch\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.281978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281955 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27wzg\" (UniqueName: \"kubernetes.io/projected/77cbed1f-c18d-4a43-bcc3-230e23453a72-kube-api-access-27wzg\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.281979 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-sysctl-conf\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282000 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-cnibin\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282024 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03c62f6c-4b55-4a95-82a9-797e22c98930-cni-binary-copy\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282065 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0ebb20a0-f56e-442b-8216-34313a45e74c-konnectivity-ca\") pod \"konnectivity-agent-zrhjw\" (UID: \"0ebb20a0-f56e-442b-8216-34313a45e74c\") " pod="kube-system/konnectivity-agent-zrhjw"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282090 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-run-systemd\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282116 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-var-lib-kubelet\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282171 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-tjq9f\""
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282164 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-multus-conf-dir\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282230 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/67732aa7-95a9-4a96-9e40-a2a525b77a52-tmp-dir\") pod \"node-resolver-7l5sm\" (UID: \"67732aa7-95a9-4a96-9e40-a2a525b77a52\") " pod="openshift-dns/node-resolver-7l5sm"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282261 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-device-dir\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282290 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/03c62f6c-4b55-4a95-82a9-797e22c98930-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282317 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-run\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282355 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-systemd-units\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282413 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-cni-bin\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282438 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-etc-selinux\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282462 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-kubernetes\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.282835 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282486 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-run-k8s-cni-cncf-io\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.283705 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282509 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-run-multus-certs\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.283705 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282535 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-modprobe-d\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.283705 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282556 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9bc5ec31-bf4b-46de-abb2-21a96bd6160a-serviceca\") pod \"node-ca-629tj\" (UID: \"9bc5ec31-bf4b-46de-abb2-21a96bd6160a\") " pod="openshift-image-registry/node-ca-629tj"
Apr 24 14:23:54.283705 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282578 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-hostroot\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.283705 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282636 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-685dx\" (UniqueName: \"kubernetes.io/projected/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-kube-api-access-685dx\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.283705 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.282662 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-sysconfig\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.311000 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.310972 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:18:53 +0000 UTC" deadline="2027-09-30 07:47:25.926217269 +0000 UTC"
Apr 24 14:23:54.311000 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.310997 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12569h23m31.615222753s"
Apr 24 14:23:54.343514 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.343485 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:23:54.370139 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.370113 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 14:23:54.383534 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383512 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27wzg\" (UniqueName: \"kubernetes.io/projected/77cbed1f-c18d-4a43-bcc3-230e23453a72-kube-api-access-27wzg\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.383675 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383546 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-sysctl-conf\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.383675 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383582 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c72r\" (UniqueName: \"kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r\") pod \"network-check-target-fkvkp\" (UID: \"3640d87a-9a53-41b1-912e-39a56479c86c\") " pod="openshift-network-diagnostics/network-check-target-fkvkp"
Apr 24 14:23:54.383675 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383609 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-cnibin\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.383833 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383671 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03c62f6c-4b55-4a95-82a9-797e22c98930-cni-binary-copy\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn"
Apr 24 14:23:54.383833 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383685 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-cnibin\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.383833 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383708 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0ebb20a0-f56e-442b-8216-34313a45e74c-konnectivity-ca\") pod \"konnectivity-agent-zrhjw\" (UID: \"0ebb20a0-f56e-442b-8216-34313a45e74c\") " pod="kube-system/konnectivity-agent-zrhjw"
Apr 24 14:23:54.383833 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383734 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-run-systemd\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.383833 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383744 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-sysctl-conf\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.383833 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383760 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-var-lib-kubelet\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.383833 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383787 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-multus-conf-dir\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.383833 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383809 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/67732aa7-95a9-4a96-9e40-a2a525b77a52-tmp-dir\") pod \"node-resolver-7l5sm\" (UID: \"67732aa7-95a9-4a96-9e40-a2a525b77a52\") " pod="openshift-dns/node-resolver-7l5sm"
Apr 24 14:23:54.384201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383833 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-device-dir\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z"
Apr 24 14:23:54.384201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383865 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/03c62f6c-4b55-4a95-82a9-797e22c98930-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn"
Apr 24 14:23:54.384201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383889 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-run\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.384201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383911 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-systemd-units\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.384201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383936 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-cni-bin\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.384201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383963 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-etc-selinux\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z"
Apr 24 14:23:54.384201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.383985 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-kubernetes\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.384201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384035 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pw9t\" (UniqueName: \"kubernetes.io/projected/90958440-ae13-4f74-8dc0-73b738f79139-kube-api-access-7pw9t\") pod \"network-metrics-daemon-f5bf4\" (UID: \"90958440-ae13-4f74-8dc0-73b738f79139\") " pod="openshift-multus/network-metrics-daemon-f5bf4"
Apr 24 14:23:54.384201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384062 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-run-k8s-cni-cncf-io\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.384201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384091 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-run-multus-certs\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.384201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384116 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-modprobe-d\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.384201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384141 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9bc5ec31-bf4b-46de-abb2-21a96bd6160a-serviceca\") pod \"node-ca-629tj\" (UID: \"9bc5ec31-bf4b-46de-abb2-21a96bd6160a\") " pod="openshift-image-registry/node-ca-629tj"
Apr 24 14:23:54.384201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384161 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03c62f6c-4b55-4a95-82a9-797e22c98930-cni-binary-copy\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn"
Apr 24 14:23:54.384201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384163 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-hostroot\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.384201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384206 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-hostroot\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384209 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-685dx\" (UniqueName: \"kubernetes.io/projected/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-kube-api-access-685dx\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384252 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/67732aa7-95a9-4a96-9e40-a2a525b77a52-tmp-dir\") pod \"node-resolver-7l5sm\" (UID: \"67732aa7-95a9-4a96-9e40-a2a525b77a52\") " pod="openshift-dns/node-resolver-7l5sm"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384262 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-sysconfig\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384300 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-sysconfig\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384328 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-run-netns\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384342 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-etc-selinux\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384358 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-var-lib-openvswitch\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384368 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-kubernetes\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384382 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-var-lib-kubelet\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384387 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-run-openvswitch\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384374 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-run-systemd\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384413 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-run-netns\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384432 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-run-multus-certs\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384444 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-socket-dir\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384446 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-run\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384482 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-systemd-units\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384507 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-multus-conf-dir\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.384963 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384513 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03c62f6c-4b55-4a95-82a9-797e22c98930-tuning-conf-dir\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn"
Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384290 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0ebb20a0-f56e-442b-8216-34313a45e74c-konnectivity-ca\") pod \"konnectivity-agent-zrhjw\" (UID: \"0ebb20a0-f56e-442b-8216-34313a45e74c\") " pod="kube-system/konnectivity-agent-zrhjw"
Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384535 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-modprobe-d\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384543 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/77cbed1f-c18d-4a43-bcc3-230e23453a72-ovnkube-config\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384556 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-run-netns\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384570 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-multus-cni-dir\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384589 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-run-k8s-cni-cncf-io\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384595 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-var-lib-cni-multus\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384643 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-etc-kubernetes\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384667 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z"
Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384673 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03c62f6c-4b55-4a95-82a9-797e22c98930-tuning-conf-dir\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn"
Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384694 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0ebb20a0-f56e-442b-8216-34313a45e74c-agent-certs\") pod \"konnectivity-agent-zrhjw\" (UID: \"0ebb20a0-f56e-442b-8216-34313a45e74c\") " pod="kube-system/konnectivity-agent-zrhjw"
Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384717 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-tuned\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384725 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-cni-bin\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384742 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"kube-api-access-xwntb\" (UniqueName: \"kubernetes.io/projected/9bc5ec31-bf4b-46de-abb2-21a96bd6160a-kube-api-access-xwntb\") pod \"node-ca-629tj\" (UID: \"9bc5ec31-bf4b-46de-abb2-21a96bd6160a\") " pod="openshift-image-registry/node-ca-629tj" Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384761 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-etc-kubernetes\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384766 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03c62f6c-4b55-4a95-82a9-797e22c98930-os-release\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn" Apr 24 14:23:54.385826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384793 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/03c62f6c-4b55-4a95-82a9-797e22c98930-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384815 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-multus-cni-dir\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384820 2570 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-systemd\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384835 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-device-dir\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384865 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-var-lib-cni-multus\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384869 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-var-lib-openvswitch\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384868 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-systemd\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384886 2570 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-log-socket\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384916 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-run-openvswitch\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384919 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-multus-socket-dir-parent\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384934 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384948 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-var-lib-cni-bin\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384957 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-multus-socket-dir-parent\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384984 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-var-lib-cni-bin\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.384797 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9bc5ec31-bf4b-46de-abb2-21a96bd6160a-serviceca\") pod \"node-ca-629tj\" (UID: \"9bc5ec31-bf4b-46de-abb2-21a96bd6160a\") " pod="openshift-image-registry/node-ca-629tj" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385029 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03c62f6c-4b55-4a95-82a9-797e22c98930-os-release\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385049 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/03c62f6c-4b55-4a95-82a9-797e22c98930-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385125 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/77cbed1f-c18d-4a43-bcc3-230e23453a72-ovnkube-config\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.386759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385175 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-log-socket\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385288 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-host-run-netns\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385307 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-socket-dir\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385317 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385411 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-multus-daemon-config\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385447 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctskh\" (UniqueName: \"kubernetes.io/projected/67732aa7-95a9-4a96-9e40-a2a525b77a52-kube-api-access-ctskh\") pod \"node-resolver-7l5sm\" (UID: \"67732aa7-95a9-4a96-9e40-a2a525b77a52\") " pod="openshift-dns/node-resolver-7l5sm" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385419 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/03c62f6c-4b55-4a95-82a9-797e22c98930-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385475 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvc9d\" (UniqueName: \"kubernetes.io/projected/65b901d7-b766-4277-901d-fd586db8de46-kube-api-access-mvc9d\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385526 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-var-lib-kubelet\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385554 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/562db2e5-0cd5-452e-9208-ceb438b32453-iptables-alerter-script\") pod \"iptables-alerter-v5zt2\" (UID: \"562db2e5-0cd5-452e-9208-ceb438b32453\") " pod="openshift-network-operator/iptables-alerter-v5zt2" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385600 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/67732aa7-95a9-4a96-9e40-a2a525b77a52-hosts-file\") pod \"node-resolver-7l5sm\" (UID: \"67732aa7-95a9-4a96-9e40-a2a525b77a52\") " pod="openshift-dns/node-resolver-7l5sm" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385644 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-cni-netd\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385671 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/77cbed1f-c18d-4a43-bcc3-230e23453a72-env-overrides\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385695 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-os-release\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385715 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-registration-dir\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385729 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-host\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385744 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ltcb\" (UniqueName: \"kubernetes.io/projected/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-kube-api-access-8ltcb\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385759 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9bc5ec31-bf4b-46de-abb2-21a96bd6160a-host\") pod \"node-ca-629tj\" (UID: \"9bc5ec31-bf4b-46de-abb2-21a96bd6160a\") " pod="openshift-image-registry/node-ca-629tj" Apr 24 14:23:54.387550 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385781 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-slash\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385803 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-run-ovn\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385821 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385844 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlqth\" (UniqueName: \"kubernetes.io/projected/562db2e5-0cd5-452e-9208-ceb438b32453-kube-api-access-zlqth\") pod \"iptables-alerter-v5zt2\" (UID: \"562db2e5-0cd5-452e-9208-ceb438b32453\") " pod="openshift-network-operator/iptables-alerter-v5zt2" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385874 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03c62f6c-4b55-4a95-82a9-797e22c98930-system-cni-dir\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385896 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-multus-daemon-config\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385967 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-registration-dir\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386012 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-host\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.385897 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ds59v\" (UniqueName: \"kubernetes.io/projected/03c62f6c-4b55-4a95-82a9-797e22c98930-kube-api-access-ds59v\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386093 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/77cbed1f-c18d-4a43-bcc3-230e23453a72-env-overrides\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386112 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-sysctl-d\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386136 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-node-log\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386151 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-var-lib-kubelet\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386156 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/562db2e5-0cd5-452e-9208-ceb438b32453-host-slash\") pod \"iptables-alerter-v5zt2\" (UID: \"562db2e5-0cd5-452e-9208-ceb438b32453\") " pod="openshift-network-operator/iptables-alerter-v5zt2" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386186 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9bc5ec31-bf4b-46de-abb2-21a96bd6160a-host\") pod \"node-ca-629tj\" (UID: \"9bc5ec31-bf4b-46de-abb2-21a96bd6160a\") " pod="openshift-image-registry/node-ca-629tj" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386187 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-sys-fs\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386216 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-os-release\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.388393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386227 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/65b901d7-b766-4277-901d-fd586db8de46-sys-fs\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386230 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386256 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03c62f6c-4b55-4a95-82a9-797e22c98930-cnibin\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn" Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: 
I0424 14:23:54.386262 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-node-log\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386272 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-lib-modules\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386287 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-tmp\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx" Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386295 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386305 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-system-cni-dir\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p" Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 
14:23:54.386332 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/77cbed1f-c18d-4a43-bcc3-230e23453a72-ovn-node-metrics-cert\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386351 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03c62f6c-4b55-4a95-82a9-797e22c98930-system-cni-dir\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn"
Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386357 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/77cbed1f-c18d-4a43-bcc3-230e23453a72-ovnkube-script-lib\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386391 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-cni-binary-copy\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386415 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-sys\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386446 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs\") pod \"network-metrics-daemon-f5bf4\" (UID: \"90958440-ae13-4f74-8dc0-73b738f79139\") " pod="openshift-multus/network-metrics-daemon-f5bf4"
Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386473 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-kubelet\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386515 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-etc-openvswitch\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386580 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-etc-openvswitch\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.389149 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386614 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/562db2e5-0cd5-452e-9208-ceb438b32453-iptables-alerter-script\") pod \"iptables-alerter-v5zt2\" (UID: \"562db2e5-0cd5-452e-9208-ceb438b32453\") " pod="openshift-network-operator/iptables-alerter-v5zt2"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386649 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-slash\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386680 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/67732aa7-95a9-4a96-9e40-a2a525b77a52-hosts-file\") pod \"node-resolver-7l5sm\" (UID: \"67732aa7-95a9-4a96-9e40-a2a525b77a52\") " pod="openshift-dns/node-resolver-7l5sm"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386691 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-run-ovn\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386707 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-cni-netd\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386737 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386783 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-sysctl-d\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386789 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/562db2e5-0cd5-452e-9208-ceb438b32453-host-slash\") pod \"iptables-alerter-v5zt2\" (UID: \"562db2e5-0cd5-452e-9208-ceb438b32453\") " pod="openshift-network-operator/iptables-alerter-v5zt2"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386787 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-sys\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386834 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03c62f6c-4b55-4a95-82a9-797e22c98930-cnibin\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386864 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-lib-modules\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386879 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/77cbed1f-c18d-4a43-bcc3-230e23453a72-ovnkube-script-lib\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.386952 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/77cbed1f-c18d-4a43-bcc3-230e23453a72-host-kubelet\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.387194 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-cni-binary-copy\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.387265 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-system-cni-dir\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.389220 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-etc-tuned\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.389235 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0ebb20a0-f56e-442b-8216-34313a45e74c-agent-certs\") pod \"konnectivity-agent-zrhjw\" (UID: \"0ebb20a0-f56e-442b-8216-34313a45e74c\") " pod="kube-system/konnectivity-agent-zrhjw"
Apr 24 14:23:54.389767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.389435 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/77cbed1f-c18d-4a43-bcc3-230e23453a72-ovn-node-metrics-cert\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.393681 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.393649 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-685dx\" (UniqueName: \"kubernetes.io/projected/e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4-kube-api-access-685dx\") pod \"multus-qhq2p\" (UID: \"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4\") " pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.393809 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.393653 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27wzg\" (UniqueName: \"kubernetes.io/projected/77cbed1f-c18d-4a43-bcc3-230e23453a72-kube-api-access-27wzg\") pod \"ovnkube-node-pbjt7\" (UID: \"77cbed1f-c18d-4a43-bcc3-230e23453a72\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.394777 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.394685 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwntb\" (UniqueName: \"kubernetes.io/projected/9bc5ec31-bf4b-46de-abb2-21a96bd6160a-kube-api-access-xwntb\") pod \"node-ca-629tj\" (UID: \"9bc5ec31-bf4b-46de-abb2-21a96bd6160a\") " pod="openshift-image-registry/node-ca-629tj"
Apr 24 14:23:54.395309 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.395291 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ltcb\" (UniqueName: \"kubernetes.io/projected/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-kube-api-access-8ltcb\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.395851 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.395826 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds59v\" (UniqueName: \"kubernetes.io/projected/03c62f6c-4b55-4a95-82a9-797e22c98930-kube-api-access-ds59v\") pod \"multus-additional-cni-plugins-96scn\" (UID: \"03c62f6c-4b55-4a95-82a9-797e22c98930\") " pod="openshift-multus/multus-additional-cni-plugins-96scn"
Apr 24 14:23:54.396154 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.396137 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2d9a8e7-5a32-4737-96ef-7cb670736a8b-tmp\") pod \"tuned-52ssx\" (UID: \"b2d9a8e7-5a32-4737-96ef-7cb670736a8b\") " pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.396365 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.396345 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvc9d\" (UniqueName: \"kubernetes.io/projected/65b901d7-b766-4277-901d-fd586db8de46-kube-api-access-mvc9d\") pod \"aws-ebs-csi-driver-node-lp66z\" (UID: \"65b901d7-b766-4277-901d-fd586db8de46\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z"
Apr 24 14:23:54.396486 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.396466 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlqth\" (UniqueName: \"kubernetes.io/projected/562db2e5-0cd5-452e-9208-ceb438b32453-kube-api-access-zlqth\") pod \"iptables-alerter-v5zt2\" (UID: \"562db2e5-0cd5-452e-9208-ceb438b32453\") " pod="openshift-network-operator/iptables-alerter-v5zt2"
Apr 24 14:23:54.397487 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.397448 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctskh\" (UniqueName: \"kubernetes.io/projected/67732aa7-95a9-4a96-9e40-a2a525b77a52-kube-api-access-ctskh\") pod \"node-resolver-7l5sm\" (UID: \"67732aa7-95a9-4a96-9e40-a2a525b77a52\") " pod="openshift-dns/node-resolver-7l5sm"
Apr 24 14:23:54.417232 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.417176 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-77.ec2.internal" event={"ID":"935c2debf0f5dd0986c38f8610254d0b","Type":"ContainerStarted","Data":"36b6509f8cb040cfacda8c1d02be64052dc74b375babf5c9750f7e15db8a738e"}
Apr 24 14:23:54.418211 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.418186 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal" event={"ID":"864e358909ff6c4ae03cf379c8abd2b5","Type":"ContainerStarted","Data":"3a839d14956beae01fd12298dd7fdc5aaecf9c80b14a0ef0dbe5460f8d448339"}
Apr 24 14:23:54.487233 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.487200 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs\") pod \"network-metrics-daemon-f5bf4\" (UID: \"90958440-ae13-4f74-8dc0-73b738f79139\") " pod="openshift-multus/network-metrics-daemon-f5bf4"
Apr 24 14:23:54.487372 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.487239 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c72r\" (UniqueName: \"kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r\") pod \"network-check-target-fkvkp\" (UID: \"3640d87a-9a53-41b1-912e-39a56479c86c\") " pod="openshift-network-diagnostics/network-check-target-fkvkp"
Apr 24 14:23:54.487372 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.487273 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pw9t\" (UniqueName: \"kubernetes.io/projected/90958440-ae13-4f74-8dc0-73b738f79139-kube-api-access-7pw9t\") pod \"network-metrics-daemon-f5bf4\" (UID: \"90958440-ae13-4f74-8dc0-73b738f79139\") " pod="openshift-multus/network-metrics-daemon-f5bf4"
Apr 24 14:23:54.487488 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:54.487383 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:23:54.487546 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:54.487511 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs podName:90958440-ae13-4f74-8dc0-73b738f79139 nodeName:}" failed. No retries permitted until 2026-04-24 14:23:54.987487858 +0000 UTC m=+3.113363562 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs") pod "network-metrics-daemon-f5bf4" (UID: "90958440-ae13-4f74-8dc0-73b738f79139") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:23:54.498190 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:54.498126 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 14:23:54.498190 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:54.498152 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 14:23:54.498190 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:54.498164 2570 projected.go:194] Error preparing data for projected volume kube-api-access-7c72r for pod openshift-network-diagnostics/network-check-target-fkvkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:23:54.498399 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:54.498226 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r podName:3640d87a-9a53-41b1-912e-39a56479c86c nodeName:}" failed. No retries permitted until 2026-04-24 14:23:54.998209894 +0000 UTC m=+3.124085591 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7c72r" (UniqueName: "kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r") pod "network-check-target-fkvkp" (UID: "3640d87a-9a53-41b1-912e-39a56479c86c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:23:54.500831 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.500804 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pw9t\" (UniqueName: \"kubernetes.io/projected/90958440-ae13-4f74-8dc0-73b738f79139-kube-api-access-7pw9t\") pod \"network-metrics-daemon-f5bf4\" (UID: \"90958440-ae13-4f74-8dc0-73b738f79139\") " pod="openshift-multus/network-metrics-daemon-f5bf4"
Apr 24 14:23:54.570000 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.569966 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zrhjw"
Apr 24 14:23:54.577880 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.577859 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-52ssx"
Apr 24 14:23:54.586509 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.586484 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7l5sm"
Apr 24 14:23:54.593905 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.593887 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-629tj"
Apr 24 14:23:54.601478 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.601453 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:23:54.609082 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.609064 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-96scn"
Apr 24 14:23:54.615740 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.615722 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z"
Apr 24 14:23:54.623316 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.623282 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-v5zt2"
Apr 24 14:23:54.628908 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.628889 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qhq2p"
Apr 24 14:23:54.775167 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.775097 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:23:54.990756 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:54.990711 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs\") pod \"network-metrics-daemon-f5bf4\" (UID: \"90958440-ae13-4f74-8dc0-73b738f79139\") " pod="openshift-multus/network-metrics-daemon-f5bf4"
Apr 24 14:23:54.990933 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:54.990872 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:23:54.990993 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:54.990952 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs podName:90958440-ae13-4f74-8dc0-73b738f79139 nodeName:}" failed. No retries permitted until 2026-04-24 14:23:55.990932345 +0000 UTC m=+4.116808044 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs") pod "network-metrics-daemon-f5bf4" (UID: "90958440-ae13-4f74-8dc0-73b738f79139") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:23:55.087777 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:55.087747 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2d9a8e7_5a32_4737_96ef_7cb670736a8b.slice/crio-ecce088a8548b73a5c61902d8dbb39b1ce6f9ac2d14f61e7273f16b924e3745c WatchSource:0}: Error finding container ecce088a8548b73a5c61902d8dbb39b1ce6f9ac2d14f61e7273f16b924e3745c: Status 404 returned error can't find the container with id ecce088a8548b73a5c61902d8dbb39b1ce6f9ac2d14f61e7273f16b924e3745c
Apr 24 14:23:55.090979 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:55.090955 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ebb20a0_f56e_442b_8216_34313a45e74c.slice/crio-f06d7a1da695e6b23f5ee6f32acd3abc30ea7a951f909b748a7e7d84a1093c8c WatchSource:0}: Error finding container f06d7a1da695e6b23f5ee6f32acd3abc30ea7a951f909b748a7e7d84a1093c8c: Status 404 returned error can't find the container with id f06d7a1da695e6b23f5ee6f32acd3abc30ea7a951f909b748a7e7d84a1093c8c
Apr 24 14:23:55.091081 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.091052 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c72r\" (UniqueName: \"kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r\") pod \"network-check-target-fkvkp\" (UID: \"3640d87a-9a53-41b1-912e-39a56479c86c\") " pod="openshift-network-diagnostics/network-check-target-fkvkp"
Apr 24 14:23:55.091235 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:55.091216 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 14:23:55.091275 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:55.091243 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 14:23:55.091275 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:55.091258 2570 projected.go:194] Error preparing data for projected volume kube-api-access-7c72r for pod openshift-network-diagnostics/network-check-target-fkvkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:23:55.091344 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:55.091333 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r podName:3640d87a-9a53-41b1-912e-39a56479c86c nodeName:}" failed. No retries permitted until 2026-04-24 14:23:56.091311103 +0000 UTC m=+4.217186789 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7c72r" (UniqueName: "kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r") pod "network-check-target-fkvkp" (UID: "3640d87a-9a53-41b1-912e-39a56479c86c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:23:55.091988 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:55.091741 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03c62f6c_4b55_4a95_82a9_797e22c98930.slice/crio-7d84cf9db6e7cc37ddfe0ef6496205e92f30a4f3972d02a56cde8369a1dfd680 WatchSource:0}: Error finding container 7d84cf9db6e7cc37ddfe0ef6496205e92f30a4f3972d02a56cde8369a1dfd680: Status 404 returned error can't find the container with id 7d84cf9db6e7cc37ddfe0ef6496205e92f30a4f3972d02a56cde8369a1dfd680
Apr 24 14:23:55.092518 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:55.092492 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65b901d7_b766_4277_901d_fd586db8de46.slice/crio-dead578d6d2dba5d71c8d4cc3d657aa7aabdb8613e04cfce6b46f884e8cb42b4 WatchSource:0}: Error finding container dead578d6d2dba5d71c8d4cc3d657aa7aabdb8613e04cfce6b46f884e8cb42b4: Status 404 returned error can't find the container with id dead578d6d2dba5d71c8d4cc3d657aa7aabdb8613e04cfce6b46f884e8cb42b4
Apr 24 14:23:55.093849 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:55.093810 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod562db2e5_0cd5_452e_9208_ceb438b32453.slice/crio-0b2978da910ddfc71c5b8792e62ad243833fa4fe19c8039d85217f6d584a872f WatchSource:0}: Error finding container 0b2978da910ddfc71c5b8792e62ad243833fa4fe19c8039d85217f6d584a872f: Status 404 returned error can't find the container with id 0b2978da910ddfc71c5b8792e62ad243833fa4fe19c8039d85217f6d584a872f
Apr 24 14:23:55.095063 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:55.095038 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bc5ec31_bf4b_46de_abb2_21a96bd6160a.slice/crio-8611368cf7d7bb5bb6fe689c98bf425222f98913cda51c8b08cd8126ee35cc51 WatchSource:0}: Error finding container 8611368cf7d7bb5bb6fe689c98bf425222f98913cda51c8b08cd8126ee35cc51: Status 404 returned error can't find the container with id 8611368cf7d7bb5bb6fe689c98bf425222f98913cda51c8b08cd8126ee35cc51
Apr 24 14:23:55.095952 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:55.095754 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77cbed1f_c18d_4a43_bcc3_230e23453a72.slice/crio-a22a5e60d829d8ec8330c143299a4ff7e19165e6d4072f50cb778f8fdd5cb500 WatchSource:0}: Error finding container a22a5e60d829d8ec8330c143299a4ff7e19165e6d4072f50cb778f8fdd5cb500: Status 404 returned error can't find the container with id a22a5e60d829d8ec8330c143299a4ff7e19165e6d4072f50cb778f8fdd5cb500
Apr 24 14:23:55.096927 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:23:55.096840 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37f0e7d_3cac_4f35_a7d4_ba18c81bc1c4.slice/crio-cc276b6181192bdbad117b5a78fab217d8811a3d91e3afe718d63c2751df34fb WatchSource:0}: Error finding container cc276b6181192bdbad117b5a78fab217d8811a3d91e3afe718d63c2751df34fb: Status 404 returned error can't find the container with id cc276b6181192bdbad117b5a78fab217d8811a3d91e3afe718d63c2751df34fb
Apr 24 14:23:55.311527 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.311449 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:18:53 +0000 UTC" deadline="2027-11-09 20:53:51.315771272 +0000 UTC"
Apr 24 14:23:55.311527 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.311480 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13542h29m56.004293724s"
Apr 24 14:23:55.413636 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.413588 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp"
Apr 24 14:23:55.413793 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:55.413710 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c"
Apr 24 14:23:55.421150 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.421123 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-77.ec2.internal" event={"ID":"935c2debf0f5dd0986c38f8610254d0b","Type":"ContainerStarted","Data":"3b2413d9982b0ceb9adcf0bf435068494604a1a59c930cfaa22d65475a2661a6"}
Apr 24 14:23:55.422257 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.422225 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qhq2p" event={"ID":"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4","Type":"ContainerStarted","Data":"cc276b6181192bdbad117b5a78fab217d8811a3d91e3afe718d63c2751df34fb"}
Apr 24 14:23:55.423279 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.423240 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-629tj" event={"ID":"9bc5ec31-bf4b-46de-abb2-21a96bd6160a","Type":"ContainerStarted","Data":"8611368cf7d7bb5bb6fe689c98bf425222f98913cda51c8b08cd8126ee35cc51"}
Apr 24 14:23:55.424205 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.424180 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-v5zt2" event={"ID":"562db2e5-0cd5-452e-9208-ceb438b32453","Type":"ContainerStarted","Data":"0b2978da910ddfc71c5b8792e62ad243833fa4fe19c8039d85217f6d584a872f"}
Apr 24 14:23:55.425380 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.425358 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-96scn" event={"ID":"03c62f6c-4b55-4a95-82a9-797e22c98930","Type":"ContainerStarted","Data":"7d84cf9db6e7cc37ddfe0ef6496205e92f30a4f3972d02a56cde8369a1dfd680"}
Apr 24 14:23:55.426277 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.426255 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zrhjw" event={"ID":"0ebb20a0-f56e-442b-8216-34313a45e74c","Type":"ContainerStarted","Data":"f06d7a1da695e6b23f5ee6f32acd3abc30ea7a951f909b748a7e7d84a1093c8c"}
Apr 24 14:23:55.427151 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.427130 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-52ssx" event={"ID":"b2d9a8e7-5a32-4737-96ef-7cb670736a8b","Type":"ContainerStarted","Data":"ecce088a8548b73a5c61902d8dbb39b1ce6f9ac2d14f61e7273f16b924e3745c"}
Apr 24 14:23:55.427990 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.427966 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" event={"ID":"77cbed1f-c18d-4a43-bcc3-230e23453a72","Type":"ContainerStarted","Data":"a22a5e60d829d8ec8330c143299a4ff7e19165e6d4072f50cb778f8fdd5cb500"}
Apr 24 14:23:55.429681 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.429661 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" event={"ID":"65b901d7-b766-4277-901d-fd586db8de46","Type":"ContainerStarted","Data":"dead578d6d2dba5d71c8d4cc3d657aa7aabdb8613e04cfce6b46f884e8cb42b4"}
Apr 24 14:23:55.430615 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.430595 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7l5sm" event={"ID":"67732aa7-95a9-4a96-9e40-a2a525b77a52","Type":"ContainerStarted","Data":"239191cf2dd5ddd33de9e1e45bd133d25cd5d4059d02e6453dfa12f776a43865"}
Apr 24 14:23:55.434046 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.434012 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-77.ec2.internal" podStartSLOduration=2.434001295 podStartE2EDuration="2.434001295s" podCreationTimestamp="2026-04-24 14:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:23:55.433796726 +0000 UTC m=+3.559672430" watchObservedRunningTime="2026-04-24 14:23:55.434001295 +0000 UTC m=+3.559876999"
Apr 24 14:23:55.999553 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:55.999300 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs\") pod \"network-metrics-daemon-f5bf4\" (UID: \"90958440-ae13-4f74-8dc0-73b738f79139\") " pod="openshift-multus/network-metrics-daemon-f5bf4"
Apr 24 14:23:55.999833 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:55.999813 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:23:55.999896 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:55.999890 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs podName:90958440-ae13-4f74-8dc0-73b738f79139 nodeName:}" failed. No retries permitted until 2026-04-24 14:23:57.999867528 +0000 UTC m=+6.125743216 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs") pod "network-metrics-daemon-f5bf4" (UID: "90958440-ae13-4f74-8dc0-73b738f79139") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:23:56.100290 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:56.100248 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c72r\" (UniqueName: \"kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r\") pod \"network-check-target-fkvkp\" (UID: \"3640d87a-9a53-41b1-912e-39a56479c86c\") " pod="openshift-network-diagnostics/network-check-target-fkvkp"
Apr 24 14:23:56.100482 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:56.100437 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 14:23:56.100482 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:56.100458 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 14:23:56.100482 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:56.100471 2570 projected.go:194] Error preparing data for projected volume kube-api-access-7c72r for pod openshift-network-diagnostics/network-check-target-fkvkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:23:56.100672 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:56.100534 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r podName:3640d87a-9a53-41b1-912e-39a56479c86c nodeName:}" failed. No retries permitted until 2026-04-24 14:23:58.100513374 +0000 UTC m=+6.226389062 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7c72r" (UniqueName: "kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r") pod "network-check-target-fkvkp" (UID: "3640d87a-9a53-41b1-912e-39a56479c86c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:23:56.415789 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:56.415712 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4"
Apr 24 14:23:56.416224 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:56.415857 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139"
Apr 24 14:23:56.450333 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:56.450009 2570 generic.go:358] "Generic (PLEG): container finished" podID="864e358909ff6c4ae03cf379c8abd2b5" containerID="571d442cd9cb8da7760671520c282866acb4356dd0ca09f7dfe82872204f435f" exitCode=0
Apr 24 14:23:56.450333 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:56.450129 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal" event={"ID":"864e358909ff6c4ae03cf379c8abd2b5","Type":"ContainerDied","Data":"571d442cd9cb8da7760671520c282866acb4356dd0ca09f7dfe82872204f435f"}
Apr 24 14:23:57.413402 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:57.413367 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp"
Apr 24 14:23:57.413659 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:57.413509 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c" Apr 24 14:23:57.462816 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:57.462768 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal" event={"ID":"864e358909ff6c4ae03cf379c8abd2b5","Type":"ContainerStarted","Data":"414d805ffe5cb86a9f9636eea11515ac46ebb4ae80e7b19d64b8de10c15ee8fa"} Apr 24 14:23:57.483726 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:57.483669 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-77.ec2.internal" podStartSLOduration=4.483650773 podStartE2EDuration="4.483650773s" podCreationTimestamp="2026-04-24 14:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:23:57.482846678 +0000 UTC m=+5.608722384" watchObservedRunningTime="2026-04-24 14:23:57.483650773 +0000 UTC m=+5.609526479" Apr 24 14:23:57.523494 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:57.523460 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-682qd"] Apr 24 14:23:57.525941 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:57.525417 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:23:57.525941 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:57.525501 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c" Apr 24 14:23:57.612718 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:57.612450 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret\") pod \"global-pull-secret-syncer-682qd\" (UID: \"8020bd6f-9604-464e-8df7-c76530a5af7c\") " pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:23:57.612718 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:57.612512 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8020bd6f-9604-464e-8df7-c76530a5af7c-kubelet-config\") pod \"global-pull-secret-syncer-682qd\" (UID: \"8020bd6f-9604-464e-8df7-c76530a5af7c\") " pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:23:57.612718 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:57.612555 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8020bd6f-9604-464e-8df7-c76530a5af7c-dbus\") pod \"global-pull-secret-syncer-682qd\" (UID: \"8020bd6f-9604-464e-8df7-c76530a5af7c\") " pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:23:57.713138 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:57.713054 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret\") pod \"global-pull-secret-syncer-682qd\" (UID: \"8020bd6f-9604-464e-8df7-c76530a5af7c\") " pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:23:57.713138 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:57.713105 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" 
(UniqueName: \"kubernetes.io/host-path/8020bd6f-9604-464e-8df7-c76530a5af7c-kubelet-config\") pod \"global-pull-secret-syncer-682qd\" (UID: \"8020bd6f-9604-464e-8df7-c76530a5af7c\") " pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:23:57.713350 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:57.713143 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8020bd6f-9604-464e-8df7-c76530a5af7c-dbus\") pod \"global-pull-secret-syncer-682qd\" (UID: \"8020bd6f-9604-464e-8df7-c76530a5af7c\") " pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:23:57.713350 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:57.713329 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8020bd6f-9604-464e-8df7-c76530a5af7c-dbus\") pod \"global-pull-secret-syncer-682qd\" (UID: \"8020bd6f-9604-464e-8df7-c76530a5af7c\") " pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:23:57.713481 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:57.713439 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:23:57.713542 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:57.713517 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret podName:8020bd6f-9604-464e-8df7-c76530a5af7c nodeName:}" failed. No retries permitted until 2026-04-24 14:23:58.21349729 +0000 UTC m=+6.339372989 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret") pod "global-pull-secret-syncer-682qd" (UID: "8020bd6f-9604-464e-8df7-c76530a5af7c") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:23:57.713817 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:57.713796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8020bd6f-9604-464e-8df7-c76530a5af7c-kubelet-config\") pod \"global-pull-secret-syncer-682qd\" (UID: \"8020bd6f-9604-464e-8df7-c76530a5af7c\") " pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:23:58.015105 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:58.015018 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs\") pod \"network-metrics-daemon-f5bf4\" (UID: \"90958440-ae13-4f74-8dc0-73b738f79139\") " pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:23:58.015263 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:58.015165 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:58.015263 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:58.015250 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs podName:90958440-ae13-4f74-8dc0-73b738f79139 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:02.015230219 +0000 UTC m=+10.141105925 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs") pod "network-metrics-daemon-f5bf4" (UID: "90958440-ae13-4f74-8dc0-73b738f79139") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:58.116393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:58.115807 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c72r\" (UniqueName: \"kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r\") pod \"network-check-target-fkvkp\" (UID: \"3640d87a-9a53-41b1-912e-39a56479c86c\") " pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:23:58.116393 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:58.115988 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:23:58.116393 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:58.116007 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:23:58.116393 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:58.116016 2570 projected.go:194] Error preparing data for projected volume kube-api-access-7c72r for pod openshift-network-diagnostics/network-check-target-fkvkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:58.116393 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:58.116071 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r podName:3640d87a-9a53-41b1-912e-39a56479c86c nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:02.116052833 +0000 UTC m=+10.241928516 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7c72r" (UniqueName: "kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r") pod "network-check-target-fkvkp" (UID: "3640d87a-9a53-41b1-912e-39a56479c86c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:58.217237 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:58.217128 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret\") pod \"global-pull-secret-syncer-682qd\" (UID: \"8020bd6f-9604-464e-8df7-c76530a5af7c\") " pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:23:58.217393 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:58.217301 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:23:58.217393 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:58.217381 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret podName:8020bd6f-9604-464e-8df7-c76530a5af7c nodeName:}" failed. No retries permitted until 2026-04-24 14:23:59.217362205 +0000 UTC m=+7.343237890 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret") pod "global-pull-secret-syncer-682qd" (UID: "8020bd6f-9604-464e-8df7-c76530a5af7c") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:23:58.416955 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:58.416434 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:23:58.416955 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:58.416577 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139" Apr 24 14:23:59.226381 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:59.226343 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret\") pod \"global-pull-secret-syncer-682qd\" (UID: \"8020bd6f-9604-464e-8df7-c76530a5af7c\") " pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:23:59.226833 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:59.226527 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:23:59.226833 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:59.226589 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret podName:8020bd6f-9604-464e-8df7-c76530a5af7c nodeName:}" failed. No retries permitted until 2026-04-24 14:24:01.226571331 +0000 UTC m=+9.352447019 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret") pod "global-pull-secret-syncer-682qd" (UID: "8020bd6f-9604-464e-8df7-c76530a5af7c") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:23:59.413407 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:59.413373 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:23:59.413407 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:23:59.413419 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:23:59.413911 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:59.413517 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c" Apr 24 14:23:59.413911 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:23:59.413784 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c" Apr 24 14:24:00.414042 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:00.413968 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:24:00.414544 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:00.414120 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139" Apr 24 14:24:01.245400 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:01.244801 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret\") pod \"global-pull-secret-syncer-682qd\" (UID: \"8020bd6f-9604-464e-8df7-c76530a5af7c\") " pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:24:01.245400 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:01.244931 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:01.245400 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:01.245046 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret podName:8020bd6f-9604-464e-8df7-c76530a5af7c nodeName:}" failed. No retries permitted until 2026-04-24 14:24:05.245025039 +0000 UTC m=+13.370900736 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret") pod "global-pull-secret-syncer-682qd" (UID: "8020bd6f-9604-464e-8df7-c76530a5af7c") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:01.413520 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:01.413491 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:24:01.413756 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:01.413616 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c" Apr 24 14:24:01.413756 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:01.413670 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:24:01.413894 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:01.413782 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c" Apr 24 14:24:02.051542 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:02.051495 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs\") pod \"network-metrics-daemon-f5bf4\" (UID: \"90958440-ae13-4f74-8dc0-73b738f79139\") " pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:24:02.052032 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:02.051702 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:02.052032 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:02.051773 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs podName:90958440-ae13-4f74-8dc0-73b738f79139 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:10.051754335 +0000 UTC m=+18.177630023 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs") pod "network-metrics-daemon-f5bf4" (UID: "90958440-ae13-4f74-8dc0-73b738f79139") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:02.152642 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:02.152592 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c72r\" (UniqueName: \"kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r\") pod \"network-check-target-fkvkp\" (UID: \"3640d87a-9a53-41b1-912e-39a56479c86c\") " pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:24:02.152846 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:02.152774 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:02.152846 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:02.152802 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:02.152846 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:02.152817 2570 projected.go:194] Error preparing data for projected volume kube-api-access-7c72r for pod openshift-network-diagnostics/network-check-target-fkvkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:02.152965 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:02.152888 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r podName:3640d87a-9a53-41b1-912e-39a56479c86c nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:10.152869591 +0000 UTC m=+18.278745280 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7c72r" (UniqueName: "kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r") pod "network-check-target-fkvkp" (UID: "3640d87a-9a53-41b1-912e-39a56479c86c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:02.417142 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:02.416982 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:24:02.417471 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:02.417418 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139" Apr 24 14:24:03.413375 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:03.413345 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:24:03.413920 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:03.413454 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c" Apr 24 14:24:03.413920 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:03.413745 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:24:03.413920 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:03.413849 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c" Apr 24 14:24:04.414735 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:04.414696 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:24:04.415173 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:04.414853 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139" Apr 24 14:24:05.276059 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:05.276023 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret\") pod \"global-pull-secret-syncer-682qd\" (UID: \"8020bd6f-9604-464e-8df7-c76530a5af7c\") " pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:24:05.276269 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:05.276168 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:05.276269 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:05.276235 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret podName:8020bd6f-9604-464e-8df7-c76530a5af7c nodeName:}" failed. No retries permitted until 2026-04-24 14:24:13.276215422 +0000 UTC m=+21.402091105 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret") pod "global-pull-secret-syncer-682qd" (UID: "8020bd6f-9604-464e-8df7-c76530a5af7c") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:05.413919 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:05.413876 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:24:05.414091 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:05.414008 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c" Apr 24 14:24:05.414456 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:05.414437 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:24:05.414552 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:05.414533 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c" Apr 24 14:24:06.416521 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:06.416481 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:24:06.416973 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:06.416650 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139" Apr 24 14:24:07.414024 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:07.413988 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:24:07.414197 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:07.413989 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:24:07.414197 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:07.414112 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c" Apr 24 14:24:07.414325 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:07.414203 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c" Apr 24 14:24:08.414114 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:08.414082 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:24:08.414580 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:08.414219 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139" Apr 24 14:24:09.413287 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:09.413250 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:24:09.413287 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:09.413297 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:24:09.413808 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:09.413394 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c" Apr 24 14:24:09.413808 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:09.413538 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c" Apr 24 14:24:10.111759 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:10.111719 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs\") pod \"network-metrics-daemon-f5bf4\" (UID: \"90958440-ae13-4f74-8dc0-73b738f79139\") " pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:24:10.112285 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:10.111903 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:10.112285 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:10.111988 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs podName:90958440-ae13-4f74-8dc0-73b738f79139 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:26.111967213 +0000 UTC m=+34.237842898 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs") pod "network-metrics-daemon-f5bf4" (UID: "90958440-ae13-4f74-8dc0-73b738f79139") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:10.212381 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:10.212342 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c72r\" (UniqueName: \"kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r\") pod \"network-check-target-fkvkp\" (UID: \"3640d87a-9a53-41b1-912e-39a56479c86c\") " pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:24:10.212573 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:10.212523 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:10.212573 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:10.212540 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:10.212573 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:10.212550 2570 projected.go:194] Error preparing data for projected volume kube-api-access-7c72r for pod openshift-network-diagnostics/network-check-target-fkvkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:10.212766 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:10.212609 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r podName:3640d87a-9a53-41b1-912e-39a56479c86c nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:26.212587694 +0000 UTC m=+34.338463380 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7c72r" (UniqueName: "kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r") pod "network-check-target-fkvkp" (UID: "3640d87a-9a53-41b1-912e-39a56479c86c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:10.413421 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:10.413332 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:24:10.413718 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:10.413474 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139" Apr 24 14:24:11.414163 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:11.414128 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:24:11.414686 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:11.414128 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:24:11.414686 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:11.414239 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c" Apr 24 14:24:11.414686 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:11.414338 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c" Apr 24 14:24:12.414206 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:12.414169 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:24:12.414573 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:12.414262 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139" Apr 24 14:24:13.339351 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.339306 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret\") pod \"global-pull-secret-syncer-682qd\" (UID: \"8020bd6f-9604-464e-8df7-c76530a5af7c\") " pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:24:13.339532 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:13.339449 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:13.339532 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:13.339527 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret podName:8020bd6f-9604-464e-8df7-c76530a5af7c nodeName:}" failed. No retries permitted until 2026-04-24 14:24:29.33951197 +0000 UTC m=+37.465387671 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret") pod "global-pull-secret-syncer-682qd" (UID: "8020bd6f-9604-464e-8df7-c76530a5af7c") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:13.414022 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.413825 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:24:13.414214 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:13.414110 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c" Apr 24 14:24:13.414425 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.413856 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:24:13.415056 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:13.414512 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c" Apr 24 14:24:13.494290 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.494205 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qhq2p" event={"ID":"e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4","Type":"ContainerStarted","Data":"dce90c9089140a14a301b20c0edaa28be36986fa8975ade86d2d86ea06c29ccd"} Apr 24 14:24:13.495640 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.495595 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-629tj" event={"ID":"9bc5ec31-bf4b-46de-abb2-21a96bd6160a","Type":"ContainerStarted","Data":"7289a80291a4961a310595b8bc3cfd83d75edcefc256406a12a43ce14b558869"} Apr 24 14:24:13.497158 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.497126 2570 generic.go:358] "Generic (PLEG): container finished" podID="03c62f6c-4b55-4a95-82a9-797e22c98930" containerID="baccd847bc7f91a3da1aed63844e6aa753601e005186cfc48febc5ec05d7ca63" exitCode=0 Apr 24 14:24:13.497313 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.497283 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-96scn" event={"ID":"03c62f6c-4b55-4a95-82a9-797e22c98930","Type":"ContainerDied","Data":"baccd847bc7f91a3da1aed63844e6aa753601e005186cfc48febc5ec05d7ca63"} Apr 24 14:24:13.499266 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.499232 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zrhjw" event={"ID":"0ebb20a0-f56e-442b-8216-34313a45e74c","Type":"ContainerStarted","Data":"e9ef33c6f44b5ff7bf2a3f1d42cc2797ce9895f642831270557212a379bb5679"} Apr 24 14:24:13.500794 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.500769 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-52ssx" 
event={"ID":"b2d9a8e7-5a32-4737-96ef-7cb670736a8b","Type":"ContainerStarted","Data":"44d57668abfce2988d2421cd0015394be2e1531167a7f2fd5225831f140acb50"} Apr 24 14:24:13.504078 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.504051 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" event={"ID":"77cbed1f-c18d-4a43-bcc3-230e23453a72","Type":"ContainerStarted","Data":"c27947391b409c21a3c1afa78d413130ca23b48e6d6109da893a8f6a2a63826a"} Apr 24 14:24:13.504196 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.504082 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" event={"ID":"77cbed1f-c18d-4a43-bcc3-230e23453a72","Type":"ContainerStarted","Data":"9ea376178f93d22676a3fb690c23ffc734a1d7a54e3f0264b82cea195cdb9fa1"} Apr 24 14:24:13.504196 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.504092 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" event={"ID":"77cbed1f-c18d-4a43-bcc3-230e23453a72","Type":"ContainerStarted","Data":"422f30e128d3e02a1bc071c0a59c76e12803720c760778ff8605c396355974ab"} Apr 24 14:24:13.504196 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.504100 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" event={"ID":"77cbed1f-c18d-4a43-bcc3-230e23453a72","Type":"ContainerStarted","Data":"33838a48b8e98c9bbf98dac349f176cd45ec6be23fd33d24c71db9f811f34a32"} Apr 24 14:24:13.504196 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.504110 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" event={"ID":"77cbed1f-c18d-4a43-bcc3-230e23453a72","Type":"ContainerStarted","Data":"626a71aae5d0b88f4fe1d04087fb84089f963cccebbee955fb328b4918886665"} Apr 24 14:24:13.504196 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.504118 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" event={"ID":"77cbed1f-c18d-4a43-bcc3-230e23453a72","Type":"ContainerStarted","Data":"6dd9e83168283d83f9049fa7a98e68810f79872d9ab972ff58ba3ca2914b256d"} Apr 24 14:24:13.505458 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.505392 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" event={"ID":"65b901d7-b766-4277-901d-fd586db8de46","Type":"ContainerStarted","Data":"57e9059e477288be7bb09c5c1e6cd79e5001b6a6fb2cfba9604eb69758fbcc7c"} Apr 24 14:24:13.506741 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.506719 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7l5sm" event={"ID":"67732aa7-95a9-4a96-9e40-a2a525b77a52","Type":"ContainerStarted","Data":"359d6ac9c30b9b54ef38fb59467eb7ab041a89dfe17281a23c8554903efee0c5"} Apr 24 14:24:13.512072 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.512010 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qhq2p" podStartSLOduration=3.948527805 podStartE2EDuration="21.511993084s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:55.099199997 +0000 UTC m=+3.225075685" lastFinishedPulling="2026-04-24 14:24:12.662665267 +0000 UTC m=+20.788540964" observedRunningTime="2026-04-24 14:24:13.511705814 +0000 UTC m=+21.637581518" watchObservedRunningTime="2026-04-24 14:24:13.511993084 +0000 UTC m=+21.637868791" Apr 24 14:24:13.544378 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.544317 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zrhjw" podStartSLOduration=12.36308786 podStartE2EDuration="21.544302916s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:55.09275707 +0000 UTC m=+3.218632753" lastFinishedPulling="2026-04-24 14:24:04.273972123 +0000 UTC m=+12.399847809" 
observedRunningTime="2026-04-24 14:24:13.525518076 +0000 UTC m=+21.651393782" watchObservedRunningTime="2026-04-24 14:24:13.544302916 +0000 UTC m=+21.670178663" Apr 24 14:24:13.562883 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.562834 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-52ssx" podStartSLOduration=4.008331028 podStartE2EDuration="21.562814578s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:55.090417113 +0000 UTC m=+3.216292798" lastFinishedPulling="2026-04-24 14:24:12.644900666 +0000 UTC m=+20.770776348" observedRunningTime="2026-04-24 14:24:13.545750319 +0000 UTC m=+21.671626029" watchObservedRunningTime="2026-04-24 14:24:13.562814578 +0000 UTC m=+21.688690285" Apr 24 14:24:13.563063 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.562973 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7l5sm" podStartSLOduration=4.007252795 podStartE2EDuration="21.562966592s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:55.087561423 +0000 UTC m=+3.213437106" lastFinishedPulling="2026-04-24 14:24:12.643275207 +0000 UTC m=+20.769150903" observedRunningTime="2026-04-24 14:24:13.56222757 +0000 UTC m=+21.688103284" watchObservedRunningTime="2026-04-24 14:24:13.562966592 +0000 UTC m=+21.688842311" Apr 24 14:24:13.577097 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.577039 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-629tj" podStartSLOduration=4.03090423 podStartE2EDuration="21.577024724s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:55.09715687 +0000 UTC m=+3.223032553" lastFinishedPulling="2026-04-24 14:24:12.643277355 +0000 UTC m=+20.769153047" observedRunningTime="2026-04-24 14:24:13.5766562 +0000 UTC m=+21.702531907" 
watchObservedRunningTime="2026-04-24 14:24:13.577024724 +0000 UTC m=+21.702900429" Apr 24 14:24:13.825117 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:13.825091 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 14:24:14.346523 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:14.346406 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T14:24:13.825110229Z","UUID":"b28d1459-7dc0-4932-a874-caba920475cf","Handler":null,"Name":"","Endpoint":""} Apr 24 14:24:14.350263 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:14.350227 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 14:24:14.350263 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:14.350262 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 14:24:14.413492 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:14.413452 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:24:14.413695 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:14.413597 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139" Apr 24 14:24:14.511273 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:14.511239 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" event={"ID":"65b901d7-b766-4277-901d-fd586db8de46","Type":"ContainerStarted","Data":"94c3da0d21dc2cd13ac8404db658ca8b90ebd03a92d67d10ecca3bbffe609dcb"} Apr 24 14:24:14.513915 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:14.513885 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-v5zt2" event={"ID":"562db2e5-0cd5-452e-9208-ceb438b32453","Type":"ContainerStarted","Data":"884a13ba5f664c3e43f3f9fad310c4c94a455c8fa25006d41d8e777222f2d9ed"} Apr 24 14:24:14.528075 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:14.528007 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-v5zt2" podStartSLOduration=4.9801843649999995 podStartE2EDuration="22.527991992s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:55.095518539 +0000 UTC m=+3.221394222" lastFinishedPulling="2026-04-24 14:24:12.643326151 +0000 UTC m=+20.769201849" observedRunningTime="2026-04-24 14:24:14.52774352 +0000 UTC m=+22.653619226" watchObservedRunningTime="2026-04-24 14:24:14.527991992 +0000 UTC m=+22.653867694" Apr 24 14:24:15.413564 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:15.413531 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:24:15.413715 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:15.413530 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:24:15.413715 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:15.413670 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c" Apr 24 14:24:15.413839 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:15.413765 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c" Apr 24 14:24:15.518019 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:15.517982 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" event={"ID":"77cbed1f-c18d-4a43-bcc3-230e23453a72","Type":"ContainerStarted","Data":"6b29f1269ef1b2955e219033e53ad238309d078736b41b3939a23fc7fd983171"} Apr 24 14:24:15.519951 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:15.519914 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" event={"ID":"65b901d7-b766-4277-901d-fd586db8de46","Type":"ContainerStarted","Data":"75deecc374ccb799de4f45151e6931859cf5c5cc150bba9f336efbddc5310df4"} Apr 24 14:24:15.537040 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:15.536991 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lp66z" podStartSLOduration=3.999245952 
podStartE2EDuration="23.536978896s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:55.094389502 +0000 UTC m=+3.220265187" lastFinishedPulling="2026-04-24 14:24:14.632122446 +0000 UTC m=+22.757998131" observedRunningTime="2026-04-24 14:24:15.536956567 +0000 UTC m=+23.662832274" watchObservedRunningTime="2026-04-24 14:24:15.536978896 +0000 UTC m=+23.662854716" Apr 24 14:24:16.413522 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:16.413478 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:24:16.413715 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:16.413648 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139" Apr 24 14:24:17.413811 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:17.413774 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd" Apr 24 14:24:17.414429 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:17.413774 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:24:17.414429 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:17.413898 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c" Apr 24 14:24:17.414429 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:17.413943 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c" Apr 24 14:24:17.864044 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:17.863758 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-zrhjw" Apr 24 14:24:17.864895 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:17.864873 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-zrhjw" Apr 24 14:24:18.413187 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:18.413155 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:24:18.413344 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:18.413264 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139" Apr 24 14:24:18.527350 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:18.527312 2570 generic.go:358] "Generic (PLEG): container finished" podID="03c62f6c-4b55-4a95-82a9-797e22c98930" containerID="91c89356b5730655343e517cbf06f4266ae7799c315d3daf0c674d1db0694515" exitCode=0 Apr 24 14:24:18.527862 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:18.527395 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-96scn" event={"ID":"03c62f6c-4b55-4a95-82a9-797e22c98930","Type":"ContainerDied","Data":"91c89356b5730655343e517cbf06f4266ae7799c315d3daf0c674d1db0694515"} Apr 24 14:24:18.530870 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:18.530846 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" event={"ID":"77cbed1f-c18d-4a43-bcc3-230e23453a72","Type":"ContainerStarted","Data":"9def58205320027a1dbbb3622b1e2407560af516f8e17eed38cd92ba6f914882"} Apr 24 14:24:18.531226 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:18.531111 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zrhjw" Apr 24 14:24:18.531226 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:18.531139 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:24:18.531226 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:18.531149 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" Apr 24 14:24:18.534595 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:18.532066 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zrhjw" Apr 24 14:24:18.547206 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:18.547146 2570 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:24:18.548919 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:18.548898 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:24:18.601530 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:18.601477 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7" podStartSLOduration=8.997748939 podStartE2EDuration="26.601460956s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:55.098469278 +0000 UTC m=+3.224344961" lastFinishedPulling="2026-04-24 14:24:12.70218129 +0000 UTC m=+20.828056978" observedRunningTime="2026-04-24 14:24:18.599503142 +0000 UTC m=+26.725378856" watchObservedRunningTime="2026-04-24 14:24:18.601460956 +0000 UTC m=+26.727336661"
Apr 24 14:24:19.414137 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:19.414106 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd"
Apr 24 14:24:19.414273 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:19.414228 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c"
Apr 24 14:24:19.414309 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:19.414285 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp"
Apr 24 14:24:19.414387 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:19.414368 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c"
Apr 24 14:24:19.533463 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:19.533202 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 14:24:19.800280 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:19.800249 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-682qd"]
Apr 24 14:24:19.800416 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:19.800349 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd"
Apr 24 14:24:19.800470 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:19.800450 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c"
Apr 24 14:24:19.803218 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:19.803187 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fkvkp"]
Apr 24 14:24:19.803353 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:19.803280 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp"
Apr 24 14:24:19.803404 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:19.803380 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c"
Apr 24 14:24:19.803798 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:19.803780 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f5bf4"]
Apr 24 14:24:19.803878 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:19.803867 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4"
Apr 24 14:24:19.803950 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:19.803935 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139"
Apr 24 14:24:20.536492 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:20.536420 2570 generic.go:358] "Generic (PLEG): container finished" podID="03c62f6c-4b55-4a95-82a9-797e22c98930" containerID="ea1a2aa4fda55bac168e403fda332e88376415d2746fc9f54b1a23e10464016e" exitCode=0
Apr 24 14:24:20.537095 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:20.536519 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-96scn" event={"ID":"03c62f6c-4b55-4a95-82a9-797e22c98930","Type":"ContainerDied","Data":"ea1a2aa4fda55bac168e403fda332e88376415d2746fc9f54b1a23e10464016e"}
Apr 24 14:24:20.537095 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:20.536923 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 14:24:21.413350 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:21.413318 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp"
Apr 24 14:24:21.413350 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:21.413338 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd"
Apr 24 14:24:21.413650 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:21.413366 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4"
Apr 24 14:24:21.413650 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:21.413467 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c"
Apr 24 14:24:21.413650 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:21.413566 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c"
Apr 24 14:24:21.413763 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:21.413671 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139"
Apr 24 14:24:22.542186 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:22.542092 2570 generic.go:358] "Generic (PLEG): container finished" podID="03c62f6c-4b55-4a95-82a9-797e22c98930" containerID="94f6e9d4708e1f0997de69acbeb36c6eb059f6500f3c290a07aa362f168bfc8f" exitCode=0
Apr 24 14:24:22.542186 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:22.542153 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-96scn" event={"ID":"03c62f6c-4b55-4a95-82a9-797e22c98930","Type":"ContainerDied","Data":"94f6e9d4708e1f0997de69acbeb36c6eb059f6500f3c290a07aa362f168bfc8f"}
Apr 24 14:24:23.413526 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:23.413483 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd"
Apr 24 14:24:23.413741 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:23.413600 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c"
Apr 24 14:24:23.413741 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:23.413639 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp"
Apr 24 14:24:23.413741 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:23.413663 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4"
Apr 24 14:24:23.413920 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:23.413749 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c"
Apr 24 14:24:23.413920 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:23.413845 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139"
Apr 24 14:24:24.492198 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:24.492155 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:24:24.492615 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:24.492425 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 14:24:24.512401 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:24.512357 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbjt7"
Apr 24 14:24:25.413574 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.413537 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp"
Apr 24 14:24:25.413767 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.413589 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd"
Apr 24 14:24:25.413767 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:25.413699 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-682qd" podUID="8020bd6f-9604-464e-8df7-c76530a5af7c"
Apr 24 14:24:25.413898 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.413791 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4"
Apr 24 14:24:25.413948 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:25.413914 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139"
Apr 24 14:24:25.414006 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:25.413988 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fkvkp" podUID="3640d87a-9a53-41b1-912e-39a56479c86c"
Apr 24 14:24:25.733555 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.733345 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-77.ec2.internal" event="NodeReady"
Apr 24 14:24:25.734079 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.733727 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 14:24:25.776262 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.776225 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hs6xt"]
Apr 24 14:24:25.796871 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.796835 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lkncg"]
Apr 24 14:24:25.797057 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.797036 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:24:25.800103 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.800072 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 14:24:25.800103 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.800091 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 14:24:25.800277 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.800079 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m8ml5\""
Apr 24 14:24:25.815907 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.815878 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hs6xt"]
Apr 24 14:24:25.815907 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.815909 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lkncg"]
Apr 24 14:24:25.816137 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.816029 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lkncg"
Apr 24 14:24:25.819072 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.819042 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 14:24:25.819221 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.819103 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 14:24:25.819681 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.819661 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-84zkd\""
Apr 24 14:24:25.819931 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.819911 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 14:24:25.929328 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.929291 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:24:25.929328 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.929327 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert\") pod \"ingress-canary-lkncg\" (UID: \"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a\") " pod="openshift-ingress-canary/ingress-canary-lkncg"
Apr 24 14:24:25.929540 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.929349 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-tmp-dir\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:24:25.929540 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.929437 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-config-volume\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:24:25.929540 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.929501 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7swrm\" (UniqueName: \"kubernetes.io/projected/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-kube-api-access-7swrm\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:24:25.929653 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:25.929548 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv6tp\" (UniqueName: \"kubernetes.io/projected/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-kube-api-access-gv6tp\") pod \"ingress-canary-lkncg\" (UID: \"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a\") " pod="openshift-ingress-canary/ingress-canary-lkncg"
Apr 24 14:24:26.030931 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:26.030825 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-config-volume\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:24:26.030931 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:26.030863 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7swrm\" (UniqueName: \"kubernetes.io/projected/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-kube-api-access-7swrm\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:24:26.030931 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:26.030893 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gv6tp\" (UniqueName: \"kubernetes.io/projected/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-kube-api-access-gv6tp\") pod \"ingress-canary-lkncg\" (UID: \"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a\") " pod="openshift-ingress-canary/ingress-canary-lkncg"
Apr 24 14:24:26.031195 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:26.030940 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:24:26.031195 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:26.030966 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert\") pod \"ingress-canary-lkncg\" (UID: \"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a\") " pod="openshift-ingress-canary/ingress-canary-lkncg"
Apr 24 14:24:26.031195 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:26.030997 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-tmp-dir\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:24:26.031195 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:26.031088 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:24:26.031195 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:26.031092 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 14:24:26.031195 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:26.031170 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls podName:2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e nodeName:}" failed. No retries permitted until 2026-04-24 14:24:26.531146346 +0000 UTC m=+34.657022034 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls") pod "dns-default-hs6xt" (UID: "2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e") : secret "dns-default-metrics-tls" not found
Apr 24 14:24:26.031422 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:26.031219 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert podName:7ab7e98f-94b0-4d9b-9d46-6666f05fb86a nodeName:}" failed. No retries permitted until 2026-04-24 14:24:26.531203259 +0000 UTC m=+34.657078942 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert") pod "ingress-canary-lkncg" (UID: "7ab7e98f-94b0-4d9b-9d46-6666f05fb86a") : secret "canary-serving-cert" not found
Apr 24 14:24:26.031422 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:26.031290 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-tmp-dir\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:24:26.031422 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:26.031389 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-config-volume\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:24:26.042675 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:26.042643 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7swrm\" (UniqueName: \"kubernetes.io/projected/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-kube-api-access-7swrm\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:24:26.042861 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:26.042785 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv6tp\" (UniqueName: \"kubernetes.io/projected/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-kube-api-access-gv6tp\") pod \"ingress-canary-lkncg\" (UID: \"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a\") " pod="openshift-ingress-canary/ingress-canary-lkncg"
Apr 24 14:24:26.131954 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:26.131908 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs\") pod \"network-metrics-daemon-f5bf4\" (UID: \"90958440-ae13-4f74-8dc0-73b738f79139\") " pod="openshift-multus/network-metrics-daemon-f5bf4"
Apr 24 14:24:26.132150 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:26.132061 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:26.132150 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:26.132134 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs podName:90958440-ae13-4f74-8dc0-73b738f79139 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:58.132116578 +0000 UTC m=+66.257992282 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs") pod "network-metrics-daemon-f5bf4" (UID: "90958440-ae13-4f74-8dc0-73b738f79139") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:24:26.233472 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:26.232786 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c72r\" (UniqueName: \"kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r\") pod \"network-check-target-fkvkp\" (UID: \"3640d87a-9a53-41b1-912e-39a56479c86c\") " pod="openshift-network-diagnostics/network-check-target-fkvkp"
Apr 24 14:24:26.233472 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:26.233004 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 14:24:26.233472 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:26.233025 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 14:24:26.233472 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:26.233038 2570 projected.go:194] Error preparing data for projected volume kube-api-access-7c72r for pod openshift-network-diagnostics/network-check-target-fkvkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:26.233472 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:26.233099 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r podName:3640d87a-9a53-41b1-912e-39a56479c86c nodeName:}" failed. No retries permitted until 2026-04-24 14:24:58.233079177 +0000 UTC m=+66.358954877 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-7c72r" (UniqueName: "kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r") pod "network-check-target-fkvkp" (UID: "3640d87a-9a53-41b1-912e-39a56479c86c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 14:24:26.536357 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:26.536309 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:24:26.536563 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:26.536369 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert\") pod \"ingress-canary-lkncg\" (UID: \"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a\") " pod="openshift-ingress-canary/ingress-canary-lkncg"
Apr 24 14:24:26.536563 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:26.536464 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:24:26.536563 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:26.536509 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 14:24:26.536563 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:26.536547 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls podName:2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e nodeName:}" failed. No retries permitted until 2026-04-24 14:24:27.536526879 +0000 UTC m=+35.662402576 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls") pod "dns-default-hs6xt" (UID: "2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e") : secret "dns-default-metrics-tls" not found
Apr 24 14:24:26.536852 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:26.536569 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert podName:7ab7e98f-94b0-4d9b-9d46-6666f05fb86a nodeName:}" failed. No retries permitted until 2026-04-24 14:24:27.536556095 +0000 UTC m=+35.662431778 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert") pod "ingress-canary-lkncg" (UID: "7ab7e98f-94b0-4d9b-9d46-6666f05fb86a") : secret "canary-serving-cert" not found
Apr 24 14:24:27.413371 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:27.413328 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp"
Apr 24 14:24:27.413876 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:27.413328 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd"
Apr 24 14:24:27.413876 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:27.413342 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4"
Apr 24 14:24:27.416728 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:27.416705 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 14:24:27.418019 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:27.417994 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-pdcbb\""
Apr 24 14:24:27.418140 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:27.418054 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 14:24:27.418193 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:27.418173 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 14:24:27.418243 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:27.418197 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 14:24:27.418344 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:27.418314 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-68j6s\""
Apr 24 14:24:27.545845 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:27.545811 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:24:27.545845 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:27.545851 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert\") pod \"ingress-canary-lkncg\" (UID: \"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a\") " pod="openshift-ingress-canary/ingress-canary-lkncg"
Apr 24 14:24:27.546092 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:27.545967 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 14:24:27.546092 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:27.545974 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:24:27.546092 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:27.546040 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert podName:7ab7e98f-94b0-4d9b-9d46-6666f05fb86a nodeName:}" failed. No retries permitted until 2026-04-24 14:24:29.546019553 +0000 UTC m=+37.671895237 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert") pod "ingress-canary-lkncg" (UID: "7ab7e98f-94b0-4d9b-9d46-6666f05fb86a") : secret "canary-serving-cert" not found
Apr 24 14:24:27.546092 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:27.546062 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls podName:2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e nodeName:}" failed. No retries permitted until 2026-04-24 14:24:29.546051067 +0000 UTC m=+37.671926750 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls") pod "dns-default-hs6xt" (UID: "2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e") : secret "dns-default-metrics-tls" not found
Apr 24 14:24:29.361808 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:29.361776 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret\") pod \"global-pull-secret-syncer-682qd\" (UID: \"8020bd6f-9604-464e-8df7-c76530a5af7c\") " pod="kube-system/global-pull-secret-syncer-682qd"
Apr 24 14:24:29.364791 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:29.364772 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8020bd6f-9604-464e-8df7-c76530a5af7c-original-pull-secret\") pod \"global-pull-secret-syncer-682qd\" (UID: \"8020bd6f-9604-464e-8df7-c76530a5af7c\") " pod="kube-system/global-pull-secret-syncer-682qd"
Apr 24 14:24:29.534384 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:29.534343 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-682qd"
Apr 24 14:24:29.560763 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:29.560728 2570 generic.go:358] "Generic (PLEG): container finished" podID="03c62f6c-4b55-4a95-82a9-797e22c98930" containerID="a1acb9fc9771f26ecf22f2895bdde14dc44bdcb67ef7838f5e11843234f4740a" exitCode=0
Apr 24 14:24:29.560925 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:29.560809 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-96scn" event={"ID":"03c62f6c-4b55-4a95-82a9-797e22c98930","Type":"ContainerDied","Data":"a1acb9fc9771f26ecf22f2895bdde14dc44bdcb67ef7838f5e11843234f4740a"}
Apr 24 14:24:29.562977 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:29.562904 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:24:29.563090 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:29.562997 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert\") pod \"ingress-canary-lkncg\" (UID: \"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a\") " pod="openshift-ingress-canary/ingress-canary-lkncg"
Apr 24 14:24:29.563090 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:29.563035 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:24:29.563090 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:29.563090 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls podName:2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e nodeName:}" failed. No retries permitted until 2026-04-24 14:24:33.563074909 +0000 UTC m=+41.688950593 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls") pod "dns-default-hs6xt" (UID: "2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e") : secret "dns-default-metrics-tls" not found
Apr 24 14:24:29.563194 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:29.563117 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 14:24:29.563194 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:29.563189 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert podName:7ab7e98f-94b0-4d9b-9d46-6666f05fb86a nodeName:}" failed. No retries permitted until 2026-04-24 14:24:33.563153844 +0000 UTC m=+41.689029550 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert") pod "ingress-canary-lkncg" (UID: "7ab7e98f-94b0-4d9b-9d46-6666f05fb86a") : secret "canary-serving-cert" not found
Apr 24 14:24:29.683256 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:29.682953 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-682qd"]
Apr 24 14:24:29.688553 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:24:29.688519 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8020bd6f_9604_464e_8df7_c76530a5af7c.slice/crio-2a6a4720afcacd5c3bad3cbdbc6b2d5a95baa5457963bd518719206573a44765 WatchSource:0}: Error finding container 2a6a4720afcacd5c3bad3cbdbc6b2d5a95baa5457963bd518719206573a44765: Status 404 returned error can't find the container with id 2a6a4720afcacd5c3bad3cbdbc6b2d5a95baa5457963bd518719206573a44765
Apr 24 14:24:30.565876 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:30.565842 2570 generic.go:358] "Generic (PLEG): container finished" podID="03c62f6c-4b55-4a95-82a9-797e22c98930" containerID="1f80ef8671b064005f2ad46ad6a20c223f8149a0d31b28094a5eb12102cca2e0" exitCode=0
Apr 24 14:24:30.566596 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:30.565934 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-96scn" event={"ID":"03c62f6c-4b55-4a95-82a9-797e22c98930","Type":"ContainerDied","Data":"1f80ef8671b064005f2ad46ad6a20c223f8149a0d31b28094a5eb12102cca2e0"}
Apr 24 14:24:30.567532 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:30.567501 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-682qd" event={"ID":"8020bd6f-9604-464e-8df7-c76530a5af7c","Type":"ContainerStarted","Data":"2a6a4720afcacd5c3bad3cbdbc6b2d5a95baa5457963bd518719206573a44765"}
Apr 24 14:24:31.573335 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:31.573303 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-96scn" event={"ID":"03c62f6c-4b55-4a95-82a9-797e22c98930","Type":"ContainerStarted","Data":"9012a1aca2ed947056af0f957427ff4935db46c2dd97d80526f6c12f14778688"}
Apr 24 14:24:31.596867 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:31.596810 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-96scn" podStartSLOduration=6.021721438 podStartE2EDuration="39.596793653s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:55.093369323 +0000 UTC m=+3.219245007" lastFinishedPulling="2026-04-24 14:24:28.668441539 +0000 UTC m=+36.794317222" observedRunningTime="2026-04-24 14:24:31.595154443 +0000 UTC m=+39.721030148" watchObservedRunningTime="2026-04-24 14:24:31.596793653 +0000 UTC m=+39.722669358"
Apr 24 14:24:33.578423 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:33.578318
2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-682qd" event={"ID":"8020bd6f-9604-464e-8df7-c76530a5af7c","Type":"ContainerStarted","Data":"c077983ae8dc173c0e7f6e701dbd7b65a80b3460c50c3f03640973951baccf82"} Apr 24 14:24:33.593388 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:33.593331 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-682qd" podStartSLOduration=33.038009553 podStartE2EDuration="36.593317057s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:24:29.69033603 +0000 UTC m=+37.816211717" lastFinishedPulling="2026-04-24 14:24:33.245643538 +0000 UTC m=+41.371519221" observedRunningTime="2026-04-24 14:24:33.592342013 +0000 UTC m=+41.718217717" watchObservedRunningTime="2026-04-24 14:24:33.593317057 +0000 UTC m=+41.719192761" Apr 24 14:24:33.599855 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:33.599829 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt" Apr 24 14:24:33.599929 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:33.599863 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert\") pod \"ingress-canary-lkncg\" (UID: \"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a\") " pod="openshift-ingress-canary/ingress-canary-lkncg" Apr 24 14:24:33.599980 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:33.599968 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:33.600012 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:33.599972 2570 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:33.600048 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:33.600019 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert podName:7ab7e98f-94b0-4d9b-9d46-6666f05fb86a nodeName:}" failed. No retries permitted until 2026-04-24 14:24:41.600006314 +0000 UTC m=+49.725881997 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert") pod "ingress-canary-lkncg" (UID: "7ab7e98f-94b0-4d9b-9d46-6666f05fb86a") : secret "canary-serving-cert" not found Apr 24 14:24:33.600048 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:33.600032 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls podName:2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e nodeName:}" failed. No retries permitted until 2026-04-24 14:24:41.600025875 +0000 UTC m=+49.725901558 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls") pod "dns-default-hs6xt" (UID: "2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e") : secret "dns-default-metrics-tls" not found Apr 24 14:24:41.657461 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:41.657406 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt" Apr 24 14:24:41.657461 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:41.657455 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert\") pod \"ingress-canary-lkncg\" (UID: \"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a\") " pod="openshift-ingress-canary/ingress-canary-lkncg" Apr 24 14:24:41.658034 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:41.657544 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:41.658034 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:41.657562 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:41.658034 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:41.657600 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert podName:7ab7e98f-94b0-4d9b-9d46-6666f05fb86a nodeName:}" failed. No retries permitted until 2026-04-24 14:24:57.65758736 +0000 UTC m=+65.783463043 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert") pod "ingress-canary-lkncg" (UID: "7ab7e98f-94b0-4d9b-9d46-6666f05fb86a") : secret "canary-serving-cert" not found Apr 24 14:24:41.658034 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:41.657644 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls podName:2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e nodeName:}" failed. No retries permitted until 2026-04-24 14:24:57.657605779 +0000 UTC m=+65.783481464 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls") pod "dns-default-hs6xt" (UID: "2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e") : secret "dns-default-metrics-tls" not found Apr 24 14:24:57.669416 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:57.669372 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt" Apr 24 14:24:57.669416 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:57.669411 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert\") pod \"ingress-canary-lkncg\" (UID: \"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a\") " pod="openshift-ingress-canary/ingress-canary-lkncg" Apr 24 14:24:57.669884 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:57.669532 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:57.669884 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:57.669539 2570 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:57.669884 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:57.669584 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert podName:7ab7e98f-94b0-4d9b-9d46-6666f05fb86a nodeName:}" failed. No retries permitted until 2026-04-24 14:25:29.669572245 +0000 UTC m=+97.795447928 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert") pod "ingress-canary-lkncg" (UID: "7ab7e98f-94b0-4d9b-9d46-6666f05fb86a") : secret "canary-serving-cert" not found Apr 24 14:24:57.669884 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:57.669597 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls podName:2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e nodeName:}" failed. No retries permitted until 2026-04-24 14:25:29.66959128 +0000 UTC m=+97.795466963 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls") pod "dns-default-hs6xt" (UID: "2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e") : secret "dns-default-metrics-tls" not found Apr 24 14:24:58.172992 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:58.172950 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs\") pod \"network-metrics-daemon-f5bf4\" (UID: \"90958440-ae13-4f74-8dc0-73b738f79139\") " pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:24:58.176211 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:58.176190 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 14:24:58.183552 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:58.183527 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 14:24:58.183610 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:24:58.183605 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs podName:90958440-ae13-4f74-8dc0-73b738f79139 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:02.183586604 +0000 UTC m=+130.309462303 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs") pod "network-metrics-daemon-f5bf4" (UID: "90958440-ae13-4f74-8dc0-73b738f79139") : secret "metrics-daemon-secret" not found Apr 24 14:24:58.274035 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:58.273980 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c72r\" (UniqueName: \"kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r\") pod \"network-check-target-fkvkp\" (UID: \"3640d87a-9a53-41b1-912e-39a56479c86c\") " pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:24:58.276796 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:58.276779 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 14:24:58.287048 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:58.287026 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 14:24:58.298497 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:58.298463 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c72r\" (UniqueName: \"kubernetes.io/projected/3640d87a-9a53-41b1-912e-39a56479c86c-kube-api-access-7c72r\") pod \"network-check-target-fkvkp\" (UID: \"3640d87a-9a53-41b1-912e-39a56479c86c\") " pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:24:58.330244 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:58.330211 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-pdcbb\"" Apr 24 14:24:58.338233 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:58.338207 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:24:58.452930 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:58.452853 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fkvkp"] Apr 24 14:24:58.456373 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:24:58.456341 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3640d87a_9a53_41b1_912e_39a56479c86c.slice/crio-3fadded741155734fe35d22039da4ff533bd5318ac95638f9a24de02e8ca802b WatchSource:0}: Error finding container 3fadded741155734fe35d22039da4ff533bd5318ac95638f9a24de02e8ca802b: Status 404 returned error can't find the container with id 3fadded741155734fe35d22039da4ff533bd5318ac95638f9a24de02e8ca802b Apr 24 14:24:58.626741 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:24:58.626709 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fkvkp" event={"ID":"3640d87a-9a53-41b1-912e-39a56479c86c","Type":"ContainerStarted","Data":"3fadded741155734fe35d22039da4ff533bd5318ac95638f9a24de02e8ca802b"} Apr 24 14:25:01.633977 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:01.633939 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fkvkp" event={"ID":"3640d87a-9a53-41b1-912e-39a56479c86c","Type":"ContainerStarted","Data":"d9666968e10c88a26032b07651b15d53ec3b733cd2551c95b13681799a1230b5"} Apr 24 14:25:01.634353 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:01.634058 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:25:01.648931 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:01.648886 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-fkvkp" 
podStartSLOduration=66.93087274 podStartE2EDuration="1m9.648871741s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:24:58.458164761 +0000 UTC m=+66.584040448" lastFinishedPulling="2026-04-24 14:25:01.176163767 +0000 UTC m=+69.302039449" observedRunningTime="2026-04-24 14:25:01.648833955 +0000 UTC m=+69.774709672" watchObservedRunningTime="2026-04-24 14:25:01.648871741 +0000 UTC m=+69.774747451" Apr 24 14:25:29.703776 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:29.703738 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt" Apr 24 14:25:29.703776 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:29.703777 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert\") pod \"ingress-canary-lkncg\" (UID: \"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a\") " pod="openshift-ingress-canary/ingress-canary-lkncg" Apr 24 14:25:29.704221 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:29.703889 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:29.704221 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:29.703891 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:29.704221 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:29.703938 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert podName:7ab7e98f-94b0-4d9b-9d46-6666f05fb86a nodeName:}" failed. No retries permitted until 2026-04-24 14:26:33.703925084 +0000 UTC m=+161.829800768 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert") pod "ingress-canary-lkncg" (UID: "7ab7e98f-94b0-4d9b-9d46-6666f05fb86a") : secret "canary-serving-cert" not found Apr 24 14:25:29.704221 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:29.703958 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls podName:2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e nodeName:}" failed. No retries permitted until 2026-04-24 14:26:33.70394449 +0000 UTC m=+161.829820176 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls") pod "dns-default-hs6xt" (UID: "2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e") : secret "dns-default-metrics-tls" not found Apr 24 14:25:32.638637 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:32.638598 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fkvkp" Apr 24 14:25:48.744284 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.744253 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p"] Apr 24 14:25:48.746257 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.746241 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" Apr 24 14:25:48.750693 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.750672 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 14:25:48.750800 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.750673 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 14:25:48.752185 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.752159 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 14:25:48.752185 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.752182 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 24 14:25:48.752724 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.752181 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bmxsf\"" Apr 24 14:25:48.758046 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.757999 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p"] Apr 24 14:25:48.833728 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.833700 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c68ms\" (UniqueName: \"kubernetes.io/projected/788ee4fa-ade7-40eb-ba34-7fa69f106caf-kube-api-access-c68ms\") pod \"cluster-monitoring-operator-75587bd455-p284p\" (UID: \"788ee4fa-ade7-40eb-ba34-7fa69f106caf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" Apr 24 14:25:48.833862 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.833733 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p284p\" (UID: \"788ee4fa-ade7-40eb-ba34-7fa69f106caf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" Apr 24 14:25:48.833862 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.833757 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/788ee4fa-ade7-40eb-ba34-7fa69f106caf-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-p284p\" (UID: \"788ee4fa-ade7-40eb-ba34-7fa69f106caf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" Apr 24 14:25:48.934224 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.934191 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c68ms\" (UniqueName: \"kubernetes.io/projected/788ee4fa-ade7-40eb-ba34-7fa69f106caf-kube-api-access-c68ms\") pod \"cluster-monitoring-operator-75587bd455-p284p\" (UID: \"788ee4fa-ade7-40eb-ba34-7fa69f106caf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" Apr 24 14:25:48.934224 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.934226 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p284p\" (UID: \"788ee4fa-ade7-40eb-ba34-7fa69f106caf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" Apr 24 14:25:48.934467 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.934254 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" 
(UniqueName: \"kubernetes.io/configmap/788ee4fa-ade7-40eb-ba34-7fa69f106caf-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-p284p\" (UID: \"788ee4fa-ade7-40eb-ba34-7fa69f106caf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" Apr 24 14:25:48.934467 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:48.934360 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 14:25:48.934467 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:48.934433 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls podName:788ee4fa-ade7-40eb-ba34-7fa69f106caf nodeName:}" failed. No retries permitted until 2026-04-24 14:25:49.434418789 +0000 UTC m=+117.560294471 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p284p" (UID: "788ee4fa-ade7-40eb-ba34-7fa69f106caf") : secret "cluster-monitoring-operator-tls" not found Apr 24 14:25:48.934880 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.934862 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/788ee4fa-ade7-40eb-ba34-7fa69f106caf-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-p284p\" (UID: \"788ee4fa-ade7-40eb-ba34-7fa69f106caf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" Apr 24 14:25:48.944964 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.944942 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l"] Apr 24 14:25:48.946559 ip-10-0-129-77 kubenswrapper[2570]: I0424 
14:25:48.946545 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" Apr 24 14:25:48.948942 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.948922 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr"] Apr 24 14:25:48.949447 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.949322 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 24 14:25:48.949447 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.949390 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-qbxz4\"" Apr 24 14:25:48.949939 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.949920 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 14:25:48.950031 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.949920 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:25:48.950031 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.949925 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 24 14:25:48.950643 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.950609 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-hp8qf"] Apr 24 14:25:48.950732 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.950718 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr" Apr 24 14:25:48.952313 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.952291 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:48.953364 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.953349 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 14:25:48.953459 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.953349 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:25:48.953533 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.953493 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 14:25:48.953533 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.953501 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-49tt2\"" Apr 24 14:25:48.958481 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.955949 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c68ms\" (UniqueName: \"kubernetes.io/projected/788ee4fa-ade7-40eb-ba34-7fa69f106caf-kube-api-access-c68ms\") pod \"cluster-monitoring-operator-75587bd455-p284p\" (UID: \"788ee4fa-ade7-40eb-ba34-7fa69f106caf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" Apr 24 14:25:48.958481 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.958016 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-zpwgk\"" Apr 24 14:25:48.959354 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.959338 
2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 14:25:48.959354 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.959349 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 14:25:48.959634 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.959607 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 14:25:48.960731 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.960714 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 14:25:48.976899 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.976882 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l"] Apr 24 14:25:48.977455 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.977439 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr"] Apr 24 14:25:48.980415 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.980397 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-hp8qf"] Apr 24 14:25:48.991352 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:48.991330 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 14:25:49.035472 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.035417 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/562c06c8-1f5f-456c-81f3-a499e9769f12-tmp\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: 
\"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.035472 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.035456 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/562c06c8-1f5f-456c-81f3-a499e9769f12-service-ca-bundle\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.035602 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.035487 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/562c06c8-1f5f-456c-81f3-a499e9769f12-serving-cert\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.035602 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.035513 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq9z9\" (UniqueName: \"kubernetes.io/projected/01600e7e-779b-41ca-b62a-79288fc11666-kube-api-access-jq9z9\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtc7l\" (UID: \"01600e7e-779b-41ca-b62a-79288fc11666\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" Apr 24 14:25:49.035602 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.035534 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75c4l\" (UniqueName: \"kubernetes.io/projected/562c06c8-1f5f-456c-81f3-a499e9769f12-kube-api-access-75c4l\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 
14:25:49.035602 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.035557 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmph2\" (UniqueName: \"kubernetes.io/projected/02995ee7-a76c-4aff-b87c-83b4540741cb-kube-api-access-rmph2\") pod \"cluster-samples-operator-6dc5bdb6b4-kmprr\" (UID: \"02995ee7-a76c-4aff-b87c-83b4540741cb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr" Apr 24 14:25:49.035602 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.035597 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01600e7e-779b-41ca-b62a-79288fc11666-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtc7l\" (UID: \"01600e7e-779b-41ca-b62a-79288fc11666\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" Apr 24 14:25:49.035786 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.035655 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kmprr\" (UID: \"02995ee7-a76c-4aff-b87c-83b4540741cb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr" Apr 24 14:25:49.035786 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.035689 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/562c06c8-1f5f-456c-81f3-a499e9769f12-snapshots\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.035786 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.035717 
2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01600e7e-779b-41ca-b62a-79288fc11666-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtc7l\" (UID: \"01600e7e-779b-41ca-b62a-79288fc11666\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" Apr 24 14:25:49.035786 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.035740 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/562c06c8-1f5f-456c-81f3-a499e9769f12-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.136809 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.136774 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/562c06c8-1f5f-456c-81f3-a499e9769f12-service-ca-bundle\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.136809 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.136808 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/562c06c8-1f5f-456c-81f3-a499e9769f12-serving-cert\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.137029 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.136827 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq9z9\" (UniqueName: 
\"kubernetes.io/projected/01600e7e-779b-41ca-b62a-79288fc11666-kube-api-access-jq9z9\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtc7l\" (UID: \"01600e7e-779b-41ca-b62a-79288fc11666\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" Apr 24 14:25:49.137029 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.136848 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75c4l\" (UniqueName: \"kubernetes.io/projected/562c06c8-1f5f-456c-81f3-a499e9769f12-kube-api-access-75c4l\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.137029 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.136883 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmph2\" (UniqueName: \"kubernetes.io/projected/02995ee7-a76c-4aff-b87c-83b4540741cb-kube-api-access-rmph2\") pod \"cluster-samples-operator-6dc5bdb6b4-kmprr\" (UID: \"02995ee7-a76c-4aff-b87c-83b4540741cb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr" Apr 24 14:25:49.137029 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.136907 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01600e7e-779b-41ca-b62a-79288fc11666-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtc7l\" (UID: \"01600e7e-779b-41ca-b62a-79288fc11666\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" Apr 24 14:25:49.137029 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.136958 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls\") 
pod \"cluster-samples-operator-6dc5bdb6b4-kmprr\" (UID: \"02995ee7-a76c-4aff-b87c-83b4540741cb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr" Apr 24 14:25:49.137029 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:49.137022 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 14:25:49.137212 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:49.137071 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls podName:02995ee7-a76c-4aff-b87c-83b4540741cb nodeName:}" failed. No retries permitted until 2026-04-24 14:25:49.637053803 +0000 UTC m=+117.762929486 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kmprr" (UID: "02995ee7-a76c-4aff-b87c-83b4540741cb") : secret "samples-operator-tls" not found Apr 24 14:25:49.137212 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.137088 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/562c06c8-1f5f-456c-81f3-a499e9769f12-snapshots\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.137212 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.137114 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01600e7e-779b-41ca-b62a-79288fc11666-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtc7l\" (UID: \"01600e7e-779b-41ca-b62a-79288fc11666\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" 
Apr 24 14:25:49.137212 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.137143 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/562c06c8-1f5f-456c-81f3-a499e9769f12-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.137334 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.137255 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/562c06c8-1f5f-456c-81f3-a499e9769f12-tmp\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.137726 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.137698 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01600e7e-779b-41ca-b62a-79288fc11666-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtc7l\" (UID: \"01600e7e-779b-41ca-b62a-79288fc11666\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" Apr 24 14:25:49.138043 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.138016 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/562c06c8-1f5f-456c-81f3-a499e9769f12-service-ca-bundle\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.138043 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.138040 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/562c06c8-1f5f-456c-81f3-a499e9769f12-tmp\") pod 
\"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.138204 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.138137 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/562c06c8-1f5f-456c-81f3-a499e9769f12-snapshots\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.138285 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.138265 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/562c06c8-1f5f-456c-81f3-a499e9769f12-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.139431 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.139412 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01600e7e-779b-41ca-b62a-79288fc11666-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtc7l\" (UID: \"01600e7e-779b-41ca-b62a-79288fc11666\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" Apr 24 14:25:49.139834 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.139817 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/562c06c8-1f5f-456c-81f3-a499e9769f12-serving-cert\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.148025 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.147994 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jq9z9\" (UniqueName: \"kubernetes.io/projected/01600e7e-779b-41ca-b62a-79288fc11666-kube-api-access-jq9z9\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtc7l\" (UID: \"01600e7e-779b-41ca-b62a-79288fc11666\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" Apr 24 14:25:49.149090 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.149073 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75c4l\" (UniqueName: \"kubernetes.io/projected/562c06c8-1f5f-456c-81f3-a499e9769f12-kube-api-access-75c4l\") pod \"insights-operator-585dfdc468-hp8qf\" (UID: \"562c06c8-1f5f-456c-81f3-a499e9769f12\") " pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.149263 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.149238 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmph2\" (UniqueName: \"kubernetes.io/projected/02995ee7-a76c-4aff-b87c-83b4540741cb-kube-api-access-rmph2\") pod \"cluster-samples-operator-6dc5bdb6b4-kmprr\" (UID: \"02995ee7-a76c-4aff-b87c-83b4540741cb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr" Apr 24 14:25:49.263502 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.263465 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" Apr 24 14:25:49.273544 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.273395 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-hp8qf" Apr 24 14:25:49.390855 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.390825 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l"] Apr 24 14:25:49.394056 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:25:49.394033 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01600e7e_779b_41ca_b62a_79288fc11666.slice/crio-41fddb65b355256f6e2eb350d614157d137f8f2a050c42e0c405c3a02bf8483f WatchSource:0}: Error finding container 41fddb65b355256f6e2eb350d614157d137f8f2a050c42e0c405c3a02bf8483f: Status 404 returned error can't find the container with id 41fddb65b355256f6e2eb350d614157d137f8f2a050c42e0c405c3a02bf8483f Apr 24 14:25:49.403226 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.403205 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-hp8qf"] Apr 24 14:25:49.410454 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:25:49.410432 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod562c06c8_1f5f_456c_81f3_a499e9769f12.slice/crio-e1b73b240f26330cf2303efaab563384ef5870a7527e28ba079389ba3bbe3f0e WatchSource:0}: Error finding container e1b73b240f26330cf2303efaab563384ef5870a7527e28ba079389ba3bbe3f0e: Status 404 returned error can't find the container with id e1b73b240f26330cf2303efaab563384ef5870a7527e28ba079389ba3bbe3f0e Apr 24 14:25:49.440396 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.440367 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-p284p\" (UID: \"788ee4fa-ade7-40eb-ba34-7fa69f106caf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" Apr 24 14:25:49.440523 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:49.440472 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 14:25:49.440563 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:49.440525 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls podName:788ee4fa-ade7-40eb-ba34-7fa69f106caf nodeName:}" failed. No retries permitted until 2026-04-24 14:25:50.440508314 +0000 UTC m=+118.566383996 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p284p" (UID: "788ee4fa-ade7-40eb-ba34-7fa69f106caf") : secret "cluster-monitoring-operator-tls" not found Apr 24 14:25:49.642037 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.641997 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kmprr\" (UID: \"02995ee7-a76c-4aff-b87c-83b4540741cb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr" Apr 24 14:25:49.642212 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:49.642141 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 14:25:49.642212 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:49.642207 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls podName:02995ee7-a76c-4aff-b87c-83b4540741cb nodeName:}" failed. No retries permitted until 2026-04-24 14:25:50.642191749 +0000 UTC m=+118.768067432 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kmprr" (UID: "02995ee7-a76c-4aff-b87c-83b4540741cb") : secret "samples-operator-tls" not found Apr 24 14:25:49.720502 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.720465 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hp8qf" event={"ID":"562c06c8-1f5f-456c-81f3-a499e9769f12","Type":"ContainerStarted","Data":"e1b73b240f26330cf2303efaab563384ef5870a7527e28ba079389ba3bbe3f0e"} Apr 24 14:25:49.721393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:49.721373 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" event={"ID":"01600e7e-779b-41ca-b62a-79288fc11666","Type":"ContainerStarted","Data":"41fddb65b355256f6e2eb350d614157d137f8f2a050c42e0c405c3a02bf8483f"} Apr 24 14:25:50.447779 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:50.447746 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p284p\" (UID: \"788ee4fa-ade7-40eb-ba34-7fa69f106caf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" Apr 24 14:25:50.448195 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:50.447873 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret 
"cluster-monitoring-operator-tls" not found Apr 24 14:25:50.448195 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:50.447930 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls podName:788ee4fa-ade7-40eb-ba34-7fa69f106caf nodeName:}" failed. No retries permitted until 2026-04-24 14:25:52.447911837 +0000 UTC m=+120.573787520 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p284p" (UID: "788ee4fa-ade7-40eb-ba34-7fa69f106caf") : secret "cluster-monitoring-operator-tls" not found Apr 24 14:25:50.649442 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:50.649397 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kmprr\" (UID: \"02995ee7-a76c-4aff-b87c-83b4540741cb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr" Apr 24 14:25:50.649659 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:50.649557 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 14:25:50.649659 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:50.649656 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls podName:02995ee7-a76c-4aff-b87c-83b4540741cb nodeName:}" failed. No retries permitted until 2026-04-24 14:25:52.649636981 +0000 UTC m=+120.775512677 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kmprr" (UID: "02995ee7-a76c-4aff-b87c-83b4540741cb") : secret "samples-operator-tls" not found Apr 24 14:25:51.726972 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:51.726923 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hp8qf" event={"ID":"562c06c8-1f5f-456c-81f3-a499e9769f12","Type":"ContainerStarted","Data":"510d429e278f505e5ed7bf267e8fed911c3508e77305d9df3339e37ce89c6b3a"} Apr 24 14:25:51.731931 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:51.728814 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" event={"ID":"01600e7e-779b-41ca-b62a-79288fc11666","Type":"ContainerStarted","Data":"f21aedf43ed61de8a2dd89bd2690dff04ba0d2f2cb1d666f131e23be22c84bdc"} Apr 24 14:25:51.743242 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:51.743192 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-hp8qf" podStartSLOduration=1.56660032 podStartE2EDuration="3.743178082s" podCreationTimestamp="2026-04-24 14:25:48 +0000 UTC" firstStartedPulling="2026-04-24 14:25:49.412089617 +0000 UTC m=+117.537965299" lastFinishedPulling="2026-04-24 14:25:51.588667363 +0000 UTC m=+119.714543061" observedRunningTime="2026-04-24 14:25:51.74276425 +0000 UTC m=+119.868639955" watchObservedRunningTime="2026-04-24 14:25:51.743178082 +0000 UTC m=+119.869053789" Apr 24 14:25:51.757606 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:51.757556 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" podStartSLOduration=1.562871758 
podStartE2EDuration="3.75753979s" podCreationTimestamp="2026-04-24 14:25:48 +0000 UTC" firstStartedPulling="2026-04-24 14:25:49.395769186 +0000 UTC m=+117.521644868" lastFinishedPulling="2026-04-24 14:25:51.590437213 +0000 UTC m=+119.716312900" observedRunningTime="2026-04-24 14:25:51.75644196 +0000 UTC m=+119.882317667" watchObservedRunningTime="2026-04-24 14:25:51.75753979 +0000 UTC m=+119.883415476" Apr 24 14:25:52.466908 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:52.466873 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p284p\" (UID: \"788ee4fa-ade7-40eb-ba34-7fa69f106caf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" Apr 24 14:25:52.467087 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:52.467024 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 14:25:52.467136 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:52.467089 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls podName:788ee4fa-ade7-40eb-ba34-7fa69f106caf nodeName:}" failed. No retries permitted until 2026-04-24 14:25:56.467073212 +0000 UTC m=+124.592948898 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p284p" (UID: "788ee4fa-ade7-40eb-ba34-7fa69f106caf") : secret "cluster-monitoring-operator-tls" not found Apr 24 14:25:52.668165 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:52.668131 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kmprr\" (UID: \"02995ee7-a76c-4aff-b87c-83b4540741cb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr" Apr 24 14:25:52.668322 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:52.668236 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 14:25:52.668322 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:52.668286 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls podName:02995ee7-a76c-4aff-b87c-83b4540741cb nodeName:}" failed. No retries permitted until 2026-04-24 14:25:56.668273815 +0000 UTC m=+124.794149499 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kmprr" (UID: "02995ee7-a76c-4aff-b87c-83b4540741cb") : secret "samples-operator-tls" not found Apr 24 14:25:53.393097 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:53.393064 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-w6fhd"] Apr 24 14:25:53.395485 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:53.395464 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w6fhd" Apr 24 14:25:53.398095 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:53.398075 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 14:25:53.399227 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:53.399208 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 14:25:53.399287 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:53.399208 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-6z652\"" Apr 24 14:25:53.406711 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:53.406672 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-w6fhd"] Apr 24 14:25:53.473945 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:53.473913 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlks9\" (UniqueName: \"kubernetes.io/projected/eb780e77-4955-4615-851f-ddc719d4d780-kube-api-access-tlks9\") pod 
\"migrator-74bb7799d9-w6fhd\" (UID: \"eb780e77-4955-4615-851f-ddc719d4d780\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w6fhd"
Apr 24 14:25:53.574501 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:53.574458 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlks9\" (UniqueName: \"kubernetes.io/projected/eb780e77-4955-4615-851f-ddc719d4d780-kube-api-access-tlks9\") pod \"migrator-74bb7799d9-w6fhd\" (UID: \"eb780e77-4955-4615-851f-ddc719d4d780\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w6fhd"
Apr 24 14:25:53.582820 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:53.582796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlks9\" (UniqueName: \"kubernetes.io/projected/eb780e77-4955-4615-851f-ddc719d4d780-kube-api-access-tlks9\") pod \"migrator-74bb7799d9-w6fhd\" (UID: \"eb780e77-4955-4615-851f-ddc719d4d780\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w6fhd"
Apr 24 14:25:53.704165 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:53.704076 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w6fhd"
Apr 24 14:25:53.819286 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:53.819255 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-w6fhd"]
Apr 24 14:25:54.158297 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:54.158269 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7l5sm_67732aa7-95a9-4a96-9e40-a2a525b77a52/dns-node-resolver/0.log"
Apr 24 14:25:54.736232 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:54.736194 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w6fhd" event={"ID":"eb780e77-4955-4615-851f-ddc719d4d780","Type":"ContainerStarted","Data":"9860c7cecd2b8391dc1029fd52e28932eb6b08730076a54f1c848007d375918d"}
Apr 24 14:25:54.959228 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:54.959192 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-znhzw"]
Apr 24 14:25:54.959453 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:54.959437 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-629tj_9bc5ec31-bf4b-46de-abb2-21a96bd6160a/node-ca/0.log"
Apr 24 14:25:54.961164 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:54.961147 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-znhzw"
Apr 24 14:25:54.963556 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:54.963537 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 24 14:25:54.964944 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:54.964921 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-g8tt9\""
Apr 24 14:25:54.964944 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:54.964936 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 24 14:25:54.965077 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:54.964928 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 24 14:25:54.965077 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:54.964941 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 24 14:25:54.974347 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:54.974328 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-znhzw"]
Apr 24 14:25:55.086238 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:55.086150 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fa6eedad-9a08-47d5-a23c-7abcb8450548-signing-key\") pod \"service-ca-865cb79987-znhzw\" (UID: \"fa6eedad-9a08-47d5-a23c-7abcb8450548\") " pod="openshift-service-ca/service-ca-865cb79987-znhzw"
Apr 24 14:25:55.086238 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:55.086218 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb2pg\" (UniqueName: \"kubernetes.io/projected/fa6eedad-9a08-47d5-a23c-7abcb8450548-kube-api-access-wb2pg\") pod \"service-ca-865cb79987-znhzw\" (UID: \"fa6eedad-9a08-47d5-a23c-7abcb8450548\") " pod="openshift-service-ca/service-ca-865cb79987-znhzw"
Apr 24 14:25:55.086417 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:55.086282 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fa6eedad-9a08-47d5-a23c-7abcb8450548-signing-cabundle\") pod \"service-ca-865cb79987-znhzw\" (UID: \"fa6eedad-9a08-47d5-a23c-7abcb8450548\") " pod="openshift-service-ca/service-ca-865cb79987-znhzw"
Apr 24 14:25:55.187331 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:55.187297 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wb2pg\" (UniqueName: \"kubernetes.io/projected/fa6eedad-9a08-47d5-a23c-7abcb8450548-kube-api-access-wb2pg\") pod \"service-ca-865cb79987-znhzw\" (UID: \"fa6eedad-9a08-47d5-a23c-7abcb8450548\") " pod="openshift-service-ca/service-ca-865cb79987-znhzw"
Apr 24 14:25:55.187471 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:55.187368 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fa6eedad-9a08-47d5-a23c-7abcb8450548-signing-cabundle\") pod \"service-ca-865cb79987-znhzw\" (UID: \"fa6eedad-9a08-47d5-a23c-7abcb8450548\") " pod="openshift-service-ca/service-ca-865cb79987-znhzw"
Apr 24 14:25:55.187587 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:55.187565 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fa6eedad-9a08-47d5-a23c-7abcb8450548-signing-key\") pod \"service-ca-865cb79987-znhzw\" (UID: \"fa6eedad-9a08-47d5-a23c-7abcb8450548\") " pod="openshift-service-ca/service-ca-865cb79987-znhzw"
Apr 24 14:25:55.188035 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:55.188018 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fa6eedad-9a08-47d5-a23c-7abcb8450548-signing-cabundle\") pod \"service-ca-865cb79987-znhzw\" (UID: \"fa6eedad-9a08-47d5-a23c-7abcb8450548\") " pod="openshift-service-ca/service-ca-865cb79987-znhzw"
Apr 24 14:25:55.190115 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:55.190095 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fa6eedad-9a08-47d5-a23c-7abcb8450548-signing-key\") pod \"service-ca-865cb79987-znhzw\" (UID: \"fa6eedad-9a08-47d5-a23c-7abcb8450548\") " pod="openshift-service-ca/service-ca-865cb79987-znhzw"
Apr 24 14:25:55.196264 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:55.196240 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb2pg\" (UniqueName: \"kubernetes.io/projected/fa6eedad-9a08-47d5-a23c-7abcb8450548-kube-api-access-wb2pg\") pod \"service-ca-865cb79987-znhzw\" (UID: \"fa6eedad-9a08-47d5-a23c-7abcb8450548\") " pod="openshift-service-ca/service-ca-865cb79987-znhzw"
Apr 24 14:25:55.269339 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:55.269315 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-znhzw"
Apr 24 14:25:55.384609 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:55.384579 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-znhzw"]
Apr 24 14:25:55.388260 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:25:55.388229 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa6eedad_9a08_47d5_a23c_7abcb8450548.slice/crio-a089c1de0f43682de945589f50dd7c9c360b7b6e4219ab3a6d6207e33ee8623d WatchSource:0}: Error finding container a089c1de0f43682de945589f50dd7c9c360b7b6e4219ab3a6d6207e33ee8623d: Status 404 returned error can't find the container with id a089c1de0f43682de945589f50dd7c9c360b7b6e4219ab3a6d6207e33ee8623d
Apr 24 14:25:55.741190 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:55.741097 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-znhzw" event={"ID":"fa6eedad-9a08-47d5-a23c-7abcb8450548","Type":"ContainerStarted","Data":"a089c1de0f43682de945589f50dd7c9c360b7b6e4219ab3a6d6207e33ee8623d"}
Apr 24 14:25:55.742444 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:55.742421 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w6fhd" event={"ID":"eb780e77-4955-4615-851f-ddc719d4d780","Type":"ContainerStarted","Data":"715fad5d59b27e45720c01dca2d2e367dae31460f4a7c97391be32e3c85bf594"}
Apr 24 14:25:55.742509 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:55.742452 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w6fhd" event={"ID":"eb780e77-4955-4615-851f-ddc719d4d780","Type":"ContainerStarted","Data":"6567d92b837388e195c100ab7ec94b356901d5696e24201e419c9c0809e978c1"}
Apr 24 14:25:55.758373 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:55.758320 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w6fhd" podStartSLOduration=1.771911657 podStartE2EDuration="2.758308046s" podCreationTimestamp="2026-04-24 14:25:53 +0000 UTC" firstStartedPulling="2026-04-24 14:25:53.8270767 +0000 UTC m=+121.952952383" lastFinishedPulling="2026-04-24 14:25:54.81347309 +0000 UTC m=+122.939348772" observedRunningTime="2026-04-24 14:25:55.757521213 +0000 UTC m=+123.883396900" watchObservedRunningTime="2026-04-24 14:25:55.758308046 +0000 UTC m=+123.884183751"
Apr 24 14:25:56.499459 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:56.499425 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p284p\" (UID: \"788ee4fa-ade7-40eb-ba34-7fa69f106caf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p"
Apr 24 14:25:56.499646 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:56.499581 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 14:25:56.499700 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:56.499675 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls podName:788ee4fa-ade7-40eb-ba34-7fa69f106caf nodeName:}" failed. No retries permitted until 2026-04-24 14:26:04.499658544 +0000 UTC m=+132.625534228 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p284p" (UID: "788ee4fa-ade7-40eb-ba34-7fa69f106caf") : secret "cluster-monitoring-operator-tls" not found
Apr 24 14:25:56.701589 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:56.701554 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kmprr\" (UID: \"02995ee7-a76c-4aff-b87c-83b4540741cb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr"
Apr 24 14:25:56.701771 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:56.701731 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 14:25:56.701830 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:25:56.701810 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls podName:02995ee7-a76c-4aff-b87c-83b4540741cb nodeName:}" failed. No retries permitted until 2026-04-24 14:26:04.701788649 +0000 UTC m=+132.827664350 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kmprr" (UID: "02995ee7-a76c-4aff-b87c-83b4540741cb") : secret "samples-operator-tls" not found
Apr 24 14:25:57.748513 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:57.748471 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-znhzw" event={"ID":"fa6eedad-9a08-47d5-a23c-7abcb8450548","Type":"ContainerStarted","Data":"689b7d9fe5a93b482fe1c6c9223cfa0213ddc77a7b16025c9ca31efa185fa442"}
Apr 24 14:25:57.767968 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:25:57.767909 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-znhzw" podStartSLOduration=2.203927524 podStartE2EDuration="3.767891485s" podCreationTimestamp="2026-04-24 14:25:54 +0000 UTC" firstStartedPulling="2026-04-24 14:25:55.390038164 +0000 UTC m=+123.515913848" lastFinishedPulling="2026-04-24 14:25:56.954002114 +0000 UTC m=+125.079877809" observedRunningTime="2026-04-24 14:25:57.767338919 +0000 UTC m=+125.893214633" watchObservedRunningTime="2026-04-24 14:25:57.767891485 +0000 UTC m=+125.893767191"
Apr 24 14:26:02.244446 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:02.244407 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs\") pod \"network-metrics-daemon-f5bf4\" (UID: \"90958440-ae13-4f74-8dc0-73b738f79139\") " pod="openshift-multus/network-metrics-daemon-f5bf4"
Apr 24 14:26:02.244991 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:26:02.244585 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 14:26:02.244991 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:26:02.244691 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs podName:90958440-ae13-4f74-8dc0-73b738f79139 nodeName:}" failed. No retries permitted until 2026-04-24 14:28:04.244670749 +0000 UTC m=+252.370546448 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs") pod "network-metrics-daemon-f5bf4" (UID: "90958440-ae13-4f74-8dc0-73b738f79139") : secret "metrics-daemon-secret" not found
Apr 24 14:26:04.563667 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:04.563615 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p284p\" (UID: \"788ee4fa-ade7-40eb-ba34-7fa69f106caf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p"
Apr 24 14:26:04.564047 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:26:04.563794 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 14:26:04.564047 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:26:04.563877 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls podName:788ee4fa-ade7-40eb-ba34-7fa69f106caf nodeName:}" failed. No retries permitted until 2026-04-24 14:26:20.563855235 +0000 UTC m=+148.689730925 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p284p" (UID: "788ee4fa-ade7-40eb-ba34-7fa69f106caf") : secret "cluster-monitoring-operator-tls" not found
Apr 24 14:26:04.765232 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:04.765197 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kmprr\" (UID: \"02995ee7-a76c-4aff-b87c-83b4540741cb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr"
Apr 24 14:26:04.767836 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:04.767806 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/02995ee7-a76c-4aff-b87c-83b4540741cb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kmprr\" (UID: \"02995ee7-a76c-4aff-b87c-83b4540741cb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr"
Apr 24 14:26:04.871826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:04.871795 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-49tt2\""
Apr 24 14:26:04.880083 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:04.880070 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr"
Apr 24 14:26:04.996945 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:04.996916 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr"]
Apr 24 14:26:05.769768 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:05.769731 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr" event={"ID":"02995ee7-a76c-4aff-b87c-83b4540741cb","Type":"ContainerStarted","Data":"65eb1713e8b0fb80ba4055f2578d56d0fb028681e0960604d48b85b58e65aed8"}
Apr 24 14:26:06.775041 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:06.775009 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr" event={"ID":"02995ee7-a76c-4aff-b87c-83b4540741cb","Type":"ContainerStarted","Data":"1b8c2444e2adac07741791950e581039a468cfa6db55930040239f0de227eaef"}
Apr 24 14:26:06.775041 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:06.775046 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr" event={"ID":"02995ee7-a76c-4aff-b87c-83b4540741cb","Type":"ContainerStarted","Data":"0087510ce9c29d2d7d815627efdae7353e9f7fa1ac5d1be9c9d47cd794defe66"}
Apr 24 14:26:06.793215 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:06.793170 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kmprr" podStartSLOduration=17.235322542 podStartE2EDuration="18.793157492s" podCreationTimestamp="2026-04-24 14:25:48 +0000 UTC" firstStartedPulling="2026-04-24 14:26:05.039088735 +0000 UTC m=+133.164964417" lastFinishedPulling="2026-04-24 14:26:06.59692367 +0000 UTC m=+134.722799367" observedRunningTime="2026-04-24 14:26:06.791905646 +0000 UTC m=+134.917781352" watchObservedRunningTime="2026-04-24 14:26:06.793157492 +0000 UTC m=+134.919033238"
Apr 24 14:26:17.173347 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.173318 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9z6hz"]
Apr 24 14:26:17.176612 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.176595 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9z6hz"
Apr 24 14:26:17.178992 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.178972 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jk68x\""
Apr 24 14:26:17.179115 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.179097 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 14:26:17.180375 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.180355 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 14:26:17.189313 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.189296 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9z6hz"]
Apr 24 14:26:17.247332 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.247304 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7cd9495d59-q64t7"]
Apr 24 14:26:17.250217 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.250199 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.253738 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.253712 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 14:26:17.253896 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.253829 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 14:26:17.253896 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.253829 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xtmb2\""
Apr 24 14:26:17.254018 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.253948 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 14:26:17.259907 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.259889 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 14:26:17.270042 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.270022 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7cd9495d59-q64t7"]
Apr 24 14:26:17.360789 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.360758 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2a9e2553-3905-4307-bccc-cf5c6355779f-image-registry-private-configuration\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.360942 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.360810 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/445ae754-e43b-4a53-8625-8ecbb1ab28b1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9z6hz\" (UID: \"445ae754-e43b-4a53-8625-8ecbb1ab28b1\") " pod="openshift-insights/insights-runtime-extractor-9z6hz"
Apr 24 14:26:17.360942 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.360832 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/445ae754-e43b-4a53-8625-8ecbb1ab28b1-data-volume\") pod \"insights-runtime-extractor-9z6hz\" (UID: \"445ae754-e43b-4a53-8625-8ecbb1ab28b1\") " pod="openshift-insights/insights-runtime-extractor-9z6hz"
Apr 24 14:26:17.360942 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.360870 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/445ae754-e43b-4a53-8625-8ecbb1ab28b1-crio-socket\") pod \"insights-runtime-extractor-9z6hz\" (UID: \"445ae754-e43b-4a53-8625-8ecbb1ab28b1\") " pod="openshift-insights/insights-runtime-extractor-9z6hz"
Apr 24 14:26:17.360942 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.360886 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxq48\" (UniqueName: \"kubernetes.io/projected/445ae754-e43b-4a53-8625-8ecbb1ab28b1-kube-api-access-mxq48\") pod \"insights-runtime-extractor-9z6hz\" (UID: \"445ae754-e43b-4a53-8625-8ecbb1ab28b1\") " pod="openshift-insights/insights-runtime-extractor-9z6hz"
Apr 24 14:26:17.361111 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.360958 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a9e2553-3905-4307-bccc-cf5c6355779f-registry-tls\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.361111 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.361005 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/445ae754-e43b-4a53-8625-8ecbb1ab28b1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9z6hz\" (UID: \"445ae754-e43b-4a53-8625-8ecbb1ab28b1\") " pod="openshift-insights/insights-runtime-extractor-9z6hz"
Apr 24 14:26:17.361111 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.361043 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2a9e2553-3905-4307-bccc-cf5c6355779f-registry-certificates\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.361111 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.361062 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a9e2553-3905-4307-bccc-cf5c6355779f-trusted-ca\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.361111 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.361078 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2a9e2553-3905-4307-bccc-cf5c6355779f-installation-pull-secrets\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.361111 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.361094 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a9e2553-3905-4307-bccc-cf5c6355779f-bound-sa-token\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.361301 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.361139 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2a9e2553-3905-4307-bccc-cf5c6355779f-ca-trust-extracted\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.361301 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.361186 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxzs4\" (UniqueName: \"kubernetes.io/projected/2a9e2553-3905-4307-bccc-cf5c6355779f-kube-api-access-bxzs4\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.462196 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.462113 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxzs4\" (UniqueName: \"kubernetes.io/projected/2a9e2553-3905-4307-bccc-cf5c6355779f-kube-api-access-bxzs4\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.462196 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.462156 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2a9e2553-3905-4307-bccc-cf5c6355779f-image-registry-private-configuration\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.462373 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.462307 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/445ae754-e43b-4a53-8625-8ecbb1ab28b1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9z6hz\" (UID: \"445ae754-e43b-4a53-8625-8ecbb1ab28b1\") " pod="openshift-insights/insights-runtime-extractor-9z6hz"
Apr 24 14:26:17.462373 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.462343 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/445ae754-e43b-4a53-8625-8ecbb1ab28b1-data-volume\") pod \"insights-runtime-extractor-9z6hz\" (UID: \"445ae754-e43b-4a53-8625-8ecbb1ab28b1\") " pod="openshift-insights/insights-runtime-extractor-9z6hz"
Apr 24 14:26:17.462467 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.462390 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/445ae754-e43b-4a53-8625-8ecbb1ab28b1-crio-socket\") pod \"insights-runtime-extractor-9z6hz\" (UID: \"445ae754-e43b-4a53-8625-8ecbb1ab28b1\") " pod="openshift-insights/insights-runtime-extractor-9z6hz"
Apr 24 14:26:17.462467 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.462412 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxq48\" (UniqueName: \"kubernetes.io/projected/445ae754-e43b-4a53-8625-8ecbb1ab28b1-kube-api-access-mxq48\") pod \"insights-runtime-extractor-9z6hz\" (UID: \"445ae754-e43b-4a53-8625-8ecbb1ab28b1\") " pod="openshift-insights/insights-runtime-extractor-9z6hz"
Apr 24 14:26:17.462467 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.462440 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a9e2553-3905-4307-bccc-cf5c6355779f-registry-tls\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.462605 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.462468 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/445ae754-e43b-4a53-8625-8ecbb1ab28b1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9z6hz\" (UID: \"445ae754-e43b-4a53-8625-8ecbb1ab28b1\") " pod="openshift-insights/insights-runtime-extractor-9z6hz"
Apr 24 14:26:17.462605 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.462471 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/445ae754-e43b-4a53-8625-8ecbb1ab28b1-crio-socket\") pod \"insights-runtime-extractor-9z6hz\" (UID: \"445ae754-e43b-4a53-8625-8ecbb1ab28b1\") " pod="openshift-insights/insights-runtime-extractor-9z6hz"
Apr 24 14:26:17.462605 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.462503 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2a9e2553-3905-4307-bccc-cf5c6355779f-registry-certificates\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.462605 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.462533 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a9e2553-3905-4307-bccc-cf5c6355779f-trusted-ca\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.462605 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.462560 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2a9e2553-3905-4307-bccc-cf5c6355779f-installation-pull-secrets\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.462605 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.462584 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a9e2553-3905-4307-bccc-cf5c6355779f-bound-sa-token\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.462927 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.462611 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2a9e2553-3905-4307-bccc-cf5c6355779f-ca-trust-extracted\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.462927 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.462782 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/445ae754-e43b-4a53-8625-8ecbb1ab28b1-data-volume\") pod \"insights-runtime-extractor-9z6hz\" (UID: \"445ae754-e43b-4a53-8625-8ecbb1ab28b1\") " pod="openshift-insights/insights-runtime-extractor-9z6hz"
Apr 24 14:26:17.463034 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.463017 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/445ae754-e43b-4a53-8625-8ecbb1ab28b1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9z6hz\" (UID: \"445ae754-e43b-4a53-8625-8ecbb1ab28b1\") " pod="openshift-insights/insights-runtime-extractor-9z6hz"
Apr 24 14:26:17.463083 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.463070 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2a9e2553-3905-4307-bccc-cf5c6355779f-ca-trust-extracted\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.463607 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.463586 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2a9e2553-3905-4307-bccc-cf5c6355779f-registry-certificates\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.463868 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.463842 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a9e2553-3905-4307-bccc-cf5c6355779f-trusted-ca\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.465222 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.465195 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2a9e2553-3905-4307-bccc-cf5c6355779f-installation-pull-secrets\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.465386 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.465370 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a9e2553-3905-4307-bccc-cf5c6355779f-registry-tls\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.465386 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.465379 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2a9e2553-3905-4307-bccc-cf5c6355779f-image-registry-private-configuration\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.465460 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.465399 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/445ae754-e43b-4a53-8625-8ecbb1ab28b1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9z6hz\" (UID: \"445ae754-e43b-4a53-8625-8ecbb1ab28b1\") " pod="openshift-insights/insights-runtime-extractor-9z6hz"
Apr 24 14:26:17.475539 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.475513 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a9e2553-3905-4307-bccc-cf5c6355779f-bound-sa-token\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7"
Apr 24 14:26:17.475667 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.475564 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxq48\" (UniqueName:
\"kubernetes.io/projected/445ae754-e43b-4a53-8625-8ecbb1ab28b1-kube-api-access-mxq48\") pod \"insights-runtime-extractor-9z6hz\" (UID: \"445ae754-e43b-4a53-8625-8ecbb1ab28b1\") " pod="openshift-insights/insights-runtime-extractor-9z6hz" Apr 24 14:26:17.475900 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.475879 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxzs4\" (UniqueName: \"kubernetes.io/projected/2a9e2553-3905-4307-bccc-cf5c6355779f-kube-api-access-bxzs4\") pod \"image-registry-7cd9495d59-q64t7\" (UID: \"2a9e2553-3905-4307-bccc-cf5c6355779f\") " pod="openshift-image-registry/image-registry-7cd9495d59-q64t7" Apr 24 14:26:17.485563 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.485544 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9z6hz" Apr 24 14:26:17.558638 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.558594 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7cd9495d59-q64t7" Apr 24 14:26:17.607336 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.607248 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9z6hz"] Apr 24 14:26:17.611588 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:26:17.611550 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod445ae754_e43b_4a53_8625_8ecbb1ab28b1.slice/crio-e64990068c6ab4bd1be20b135fb45fc91c8cf063de98d89a78ff3a5405da4182 WatchSource:0}: Error finding container e64990068c6ab4bd1be20b135fb45fc91c8cf063de98d89a78ff3a5405da4182: Status 404 returned error can't find the container with id e64990068c6ab4bd1be20b135fb45fc91c8cf063de98d89a78ff3a5405da4182 Apr 24 14:26:17.684796 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.684764 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7cd9495d59-q64t7"] Apr 24 14:26:17.688571 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:26:17.688540 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a9e2553_3905_4307_bccc_cf5c6355779f.slice/crio-562c37443c92191748705bf7eecf658b3d3eab8ab3ec7947047a45bd111b3df5 WatchSource:0}: Error finding container 562c37443c92191748705bf7eecf658b3d3eab8ab3ec7947047a45bd111b3df5: Status 404 returned error can't find the container with id 562c37443c92191748705bf7eecf658b3d3eab8ab3ec7947047a45bd111b3df5 Apr 24 14:26:17.805692 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.805658 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cd9495d59-q64t7" event={"ID":"2a9e2553-3905-4307-bccc-cf5c6355779f","Type":"ContainerStarted","Data":"2b81f661cbed82e4ceb7bf395c7a83c27ca5d92f2b496b21c6cb3c749ac5602a"} Apr 24 14:26:17.805873 ip-10-0-129-77 
kubenswrapper[2570]: I0424 14:26:17.805701 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cd9495d59-q64t7" event={"ID":"2a9e2553-3905-4307-bccc-cf5c6355779f","Type":"ContainerStarted","Data":"562c37443c92191748705bf7eecf658b3d3eab8ab3ec7947047a45bd111b3df5"} Apr 24 14:26:17.805873 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.805762 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7cd9495d59-q64t7" Apr 24 14:26:17.807030 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.807007 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9z6hz" event={"ID":"445ae754-e43b-4a53-8625-8ecbb1ab28b1","Type":"ContainerStarted","Data":"c3ac5a11a2c37c270a68b426fe31b433d57f270029faf23d35483521f976b117"} Apr 24 14:26:17.807139 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.807036 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9z6hz" event={"ID":"445ae754-e43b-4a53-8625-8ecbb1ab28b1","Type":"ContainerStarted","Data":"e64990068c6ab4bd1be20b135fb45fc91c8cf063de98d89a78ff3a5405da4182"} Apr 24 14:26:17.825617 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:17.825575 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7cd9495d59-q64t7" podStartSLOduration=0.825559549 podStartE2EDuration="825.559549ms" podCreationTimestamp="2026-04-24 14:26:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:26:17.824304015 +0000 UTC m=+145.950179720" watchObservedRunningTime="2026-04-24 14:26:17.825559549 +0000 UTC m=+145.951435254" Apr 24 14:26:18.812020 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:18.811980 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-9z6hz" event={"ID":"445ae754-e43b-4a53-8625-8ecbb1ab28b1","Type":"ContainerStarted","Data":"c1f2b9f02d42fa552dbd21304adfe4dd35ff0ed4ab07d4acb7ca33fb589f5d9e"} Apr 24 14:26:20.586325 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:20.586263 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p284p\" (UID: \"788ee4fa-ade7-40eb-ba34-7fa69f106caf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" Apr 24 14:26:20.588830 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:20.588809 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/788ee4fa-ade7-40eb-ba34-7fa69f106caf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p284p\" (UID: \"788ee4fa-ade7-40eb-ba34-7fa69f106caf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" Apr 24 14:26:20.819320 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:20.819282 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9z6hz" event={"ID":"445ae754-e43b-4a53-8625-8ecbb1ab28b1","Type":"ContainerStarted","Data":"93841a33a3da6559191e51a7985e13b76a5d00e4a9e3ae03a0a17c3ccb0632c9"} Apr 24 14:26:20.835747 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:20.835706 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9z6hz" podStartSLOduration=1.723039405 podStartE2EDuration="3.835692147s" podCreationTimestamp="2026-04-24 14:26:17 +0000 UTC" firstStartedPulling="2026-04-24 14:26:17.668848487 +0000 UTC m=+145.794724170" lastFinishedPulling="2026-04-24 14:26:19.781501229 +0000 UTC m=+147.907376912" 
observedRunningTime="2026-04-24 14:26:20.834980003 +0000 UTC m=+148.960855707" watchObservedRunningTime="2026-04-24 14:26:20.835692147 +0000 UTC m=+148.961567851" Apr 24 14:26:20.857684 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:20.857659 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bmxsf\"" Apr 24 14:26:20.865950 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:20.865929 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" Apr 24 14:26:20.978832 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:20.978800 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p"] Apr 24 14:26:20.982051 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:26:20.982010 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod788ee4fa_ade7_40eb_ba34_7fa69f106caf.slice/crio-950d2f4fe9218b558c100c834576124a3a60cee968c22f409873ca6a88a043f6 WatchSource:0}: Error finding container 950d2f4fe9218b558c100c834576124a3a60cee968c22f409873ca6a88a043f6: Status 404 returned error can't find the container with id 950d2f4fe9218b558c100c834576124a3a60cee968c22f409873ca6a88a043f6 Apr 24 14:26:21.823841 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:21.823799 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" event={"ID":"788ee4fa-ade7-40eb-ba34-7fa69f106caf","Type":"ContainerStarted","Data":"950d2f4fe9218b558c100c834576124a3a60cee968c22f409873ca6a88a043f6"} Apr 24 14:26:22.829106 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:22.828370 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" 
event={"ID":"788ee4fa-ade7-40eb-ba34-7fa69f106caf","Type":"ContainerStarted","Data":"301b2c3b6688631120ec7b6c5a0fffb2fd95f1abb4ffe4804770c20c737ff85a"} Apr 24 14:26:22.847157 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:22.847098 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p284p" podStartSLOduration=33.256598134 podStartE2EDuration="34.847081182s" podCreationTimestamp="2026-04-24 14:25:48 +0000 UTC" firstStartedPulling="2026-04-24 14:26:20.983883192 +0000 UTC m=+149.109758874" lastFinishedPulling="2026-04-24 14:26:22.574366224 +0000 UTC m=+150.700241922" observedRunningTime="2026-04-24 14:26:22.84701088 +0000 UTC m=+150.972886636" watchObservedRunningTime="2026-04-24 14:26:22.847081182 +0000 UTC m=+150.972956887" Apr 24 14:26:28.809861 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:26:28.809813 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-hs6xt" podUID="2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e" Apr 24 14:26:28.826036 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:26:28.826013 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-lkncg" podUID="7ab7e98f-94b0-4d9b-9d46-6666f05fb86a" Apr 24 14:26:28.842120 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:28.842098 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hs6xt" Apr 24 14:26:30.439781 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:26:30.439739 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-f5bf4" podUID="90958440-ae13-4f74-8dc0-73b738f79139" Apr 24 14:26:30.547716 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.547676 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-s6ldx"] Apr 24 14:26:30.553201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.553168 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.556826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.556803 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 14:26:30.558074 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.557823 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 14:26:30.558074 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.557868 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 14:26:30.558074 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.557881 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 14:26:30.558333 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.558127 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8xq7d\"" Apr 24 14:26:30.657741 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.657708 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f8125906-514c-4f62-9b0c-82f218af43f1-root\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.657925 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.657767 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-textfile\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.657925 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.657787 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-tls\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.657925 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.657812 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-wtmp\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.657925 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.657835 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-accelerators-collector-config\") pod \"node-exporter-s6ldx\" (UID: 
\"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.657925 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.657869 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8125906-514c-4f62-9b0c-82f218af43f1-metrics-client-ca\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.658111 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.657930 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.658111 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.657974 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8125906-514c-4f62-9b0c-82f218af43f1-sys\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.658111 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.658027 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzmv7\" (UniqueName: \"kubernetes.io/projected/f8125906-514c-4f62-9b0c-82f218af43f1-kube-api-access-bzmv7\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.758935 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.758854 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/f8125906-514c-4f62-9b0c-82f218af43f1-metrics-client-ca\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.758935 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.758892 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.758935 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.758910 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8125906-514c-4f62-9b0c-82f218af43f1-sys\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.759206 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.758966 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8125906-514c-4f62-9b0c-82f218af43f1-sys\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.759206 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.758979 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzmv7\" (UniqueName: \"kubernetes.io/projected/f8125906-514c-4f62-9b0c-82f218af43f1-kube-api-access-bzmv7\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.759206 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.759073 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" 
(UniqueName: \"kubernetes.io/host-path/f8125906-514c-4f62-9b0c-82f218af43f1-root\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.759206 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.759127 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-textfile\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.759206 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.759150 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-tls\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.759206 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.759191 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-wtmp\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.759481 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.759206 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f8125906-514c-4f62-9b0c-82f218af43f1-root\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.759481 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.759219 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-accelerators-collector-config\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.759481 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:26:30.759351 2570 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 14:26:30.759481 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:26:30.759416 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-tls podName:f8125906-514c-4f62-9b0c-82f218af43f1 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:31.2593955 +0000 UTC m=+159.385271197 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-tls") pod "node-exporter-s6ldx" (UID: "f8125906-514c-4f62-9b0c-82f218af43f1") : secret "node-exporter-tls" not found Apr 24 14:26:30.759721 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.759581 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-textfile\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.759808 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.759785 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-wtmp\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.759867 
ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.759843 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-accelerators-collector-config\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.760811 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.760788 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8125906-514c-4f62-9b0c-82f218af43f1-metrics-client-ca\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.762405 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.761857 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:30.771725 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:30.771698 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzmv7\" (UniqueName: \"kubernetes.io/projected/f8125906-514c-4f62-9b0c-82f218af43f1-kube-api-access-bzmv7\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx" Apr 24 14:26:31.262795 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:31.262755 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-tls\") pod \"node-exporter-s6ldx\" (UID: 
\"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx"
Apr 24 14:26:31.265308 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:31.265278 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f8125906-514c-4f62-9b0c-82f218af43f1-node-exporter-tls\") pod \"node-exporter-s6ldx\" (UID: \"f8125906-514c-4f62-9b0c-82f218af43f1\") " pod="openshift-monitoring/node-exporter-s6ldx"
Apr 24 14:26:31.462877 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:31.462842 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-s6ldx"
Apr 24 14:26:31.471639 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:26:31.471599 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8125906_514c_4f62_9b0c_82f218af43f1.slice/crio-3ec25b390d96407ec800ed2e1c8b80bb6a50b149ce2740dd6fed93b4e9dd0987 WatchSource:0}: Error finding container 3ec25b390d96407ec800ed2e1c8b80bb6a50b149ce2740dd6fed93b4e9dd0987: Status 404 returned error can't find the container with id 3ec25b390d96407ec800ed2e1c8b80bb6a50b149ce2740dd6fed93b4e9dd0987
Apr 24 14:26:31.851523 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:31.851482 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s6ldx" event={"ID":"f8125906-514c-4f62-9b0c-82f218af43f1","Type":"ContainerStarted","Data":"3ec25b390d96407ec800ed2e1c8b80bb6a50b149ce2740dd6fed93b4e9dd0987"}
Apr 24 14:26:32.518615 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.518533 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-b99977797-jpktx"]
Apr 24 14:26:32.521518 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.521501 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.524328 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.524300 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 24 14:26:32.524464 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.524331 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 24 14:26:32.524464 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.524338 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 24 14:26:32.524464 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.524303 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 24 14:26:32.524464 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.524354 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-7pb9a9nhhjv0m\""
Apr 24 14:26:32.524786 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.524770 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 24 14:26:32.525534 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.525518 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-757pn\""
Apr 24 14:26:32.531873 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.531853 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-b99977797-jpktx"]
Apr 24 14:26:32.573122 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.573092 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-thanos-querier-tls\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.573253 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.573169 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.573253 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.573214 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfnws\" (UniqueName: \"kubernetes.io/projected/4b0b822c-db44-478f-a5c4-349987078d8c-kube-api-access-kfnws\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.573372 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.573257 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.573372 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.573295 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.573372 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.573363 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b0b822c-db44-478f-a5c4-349987078d8c-metrics-client-ca\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.573509 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.573390 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-grpc-tls\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.573509 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.573422 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.674542 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.674510 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.674725 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.674551 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.674725 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.674680 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b0b822c-db44-478f-a5c4-349987078d8c-metrics-client-ca\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.674725 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.674719 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-grpc-tls\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.674882 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.674750 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.674882 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.674792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-thanos-querier-tls\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.674982 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.674881 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.674982 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.674935 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfnws\" (UniqueName: \"kubernetes.io/projected/4b0b822c-db44-478f-a5c4-349987078d8c-kube-api-access-kfnws\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.675712 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.675663 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b0b822c-db44-478f-a5c4-349987078d8c-metrics-client-ca\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.677420 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.677365 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.677537 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.677470 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-grpc-tls\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.677848 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.677825 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.677921 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.677831 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.677921 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.677874 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.677921 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.677916 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4b0b822c-db44-478f-a5c4-349987078d8c-secret-thanos-querier-tls\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.684243 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.684221 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfnws\" (UniqueName: \"kubernetes.io/projected/4b0b822c-db44-478f-a5c4-349987078d8c-kube-api-access-kfnws\") pod \"thanos-querier-b99977797-jpktx\" (UID: \"4b0b822c-db44-478f-a5c4-349987078d8c\") " pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.831363 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.831281 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-b99977797-jpktx"
Apr 24 14:26:32.855450 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.855411 2570 generic.go:358] "Generic (PLEG): container finished" podID="f8125906-514c-4f62-9b0c-82f218af43f1" containerID="180248f3a01c3890610dbe9789b182c5f9ca4bf1ec26043ad2abc2e24ec5f59b" exitCode=0
Apr 24 14:26:32.855594 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.855454 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s6ldx" event={"ID":"f8125906-514c-4f62-9b0c-82f218af43f1","Type":"ContainerDied","Data":"180248f3a01c3890610dbe9789b182c5f9ca4bf1ec26043ad2abc2e24ec5f59b"}
Apr 24 14:26:32.961155 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:32.961131 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-b99977797-jpktx"]
Apr 24 14:26:32.964956 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:26:32.964931 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b0b822c_db44_478f_a5c4_349987078d8c.slice/crio-64e7e01c171bc488239fffaf2b78dcef2810e6edb580279a82478900dcec1527 WatchSource:0}: Error finding container 64e7e01c171bc488239fffaf2b78dcef2810e6edb580279a82478900dcec1527: Status 404 returned error can't find the container with id 64e7e01c171bc488239fffaf2b78dcef2810e6edb580279a82478900dcec1527
Apr 24 14:26:33.788276 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:33.788239 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:26:33.788669 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:33.788283 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert\") pod \"ingress-canary-lkncg\" (UID: \"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a\") " pod="openshift-ingress-canary/ingress-canary-lkncg"
Apr 24 14:26:33.790704 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:33.790674 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e-metrics-tls\") pod \"dns-default-hs6xt\" (UID: \"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e\") " pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:26:33.790816 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:33.790772 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ab7e98f-94b0-4d9b-9d46-6666f05fb86a-cert\") pod \"ingress-canary-lkncg\" (UID: \"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a\") " pod="openshift-ingress-canary/ingress-canary-lkncg"
Apr 24 14:26:33.860588 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:33.860545 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s6ldx" event={"ID":"f8125906-514c-4f62-9b0c-82f218af43f1","Type":"ContainerStarted","Data":"d5733be702860c314871fa1957a1f610e99f9edfa312916662c7576833c1f7ba"}
Apr 24 14:26:33.860588 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:33.860591 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s6ldx" event={"ID":"f8125906-514c-4f62-9b0c-82f218af43f1","Type":"ContainerStarted","Data":"a0d0ce75af280788707d1278363e5f7669c89e3f23fc91d9c184da987b09b989"}
Apr 24 14:26:33.861802 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:33.861768 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b99977797-jpktx" event={"ID":"4b0b822c-db44-478f-a5c4-349987078d8c","Type":"ContainerStarted","Data":"64e7e01c171bc488239fffaf2b78dcef2810e6edb580279a82478900dcec1527"}
Apr 24 14:26:33.884224 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:33.884171 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-s6ldx" podStartSLOduration=3.119413678 podStartE2EDuration="3.884158293s" podCreationTimestamp="2026-04-24 14:26:30 +0000 UTC" firstStartedPulling="2026-04-24 14:26:31.473434515 +0000 UTC m=+159.599310197" lastFinishedPulling="2026-04-24 14:26:32.238179126 +0000 UTC m=+160.364054812" observedRunningTime="2026-04-24 14:26:33.883401691 +0000 UTC m=+162.009277397" watchObservedRunningTime="2026-04-24 14:26:33.884158293 +0000 UTC m=+162.010033998"
Apr 24 14:26:33.945303 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:33.945274 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m8ml5\""
Apr 24 14:26:33.953696 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:33.953672 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hs6xt"
Apr 24 14:26:34.086756 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.086732 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hs6xt"]
Apr 24 14:26:34.089584 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:26:34.089533 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d92e4ce_51ba_4d8c_ae8e_bd3994ddf63e.slice/crio-08426fe31547b4df6c3b26a38d1219a88821cc1110498c452903dd95246c274f WatchSource:0}: Error finding container 08426fe31547b4df6c3b26a38d1219a88821cc1110498c452903dd95246c274f: Status 404 returned error can't find the container with id 08426fe31547b4df6c3b26a38d1219a88821cc1110498c452903dd95246c274f
Apr 24 14:26:34.865254 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.865138 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hs6xt" event={"ID":"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e","Type":"ContainerStarted","Data":"08426fe31547b4df6c3b26a38d1219a88821cc1110498c452903dd95246c274f"}
Apr 24 14:26:34.887111 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.887087 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-585c75f8cc-9sn69"]
Apr 24 14:26:34.890410 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.890390 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:34.893234 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.893213 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 24 14:26:34.893234 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.893227 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 24 14:26:34.894731 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.894562 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 24 14:26:34.894731 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.894578 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-b9te93b44f91j\""
Apr 24 14:26:34.894731 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.894600 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-mdnx7\""
Apr 24 14:26:34.894731 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.894650 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 24 14:26:34.899812 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.899788 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-585c75f8cc-9sn69"]
Apr 24 14:26:34.997231 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.997106 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/38cf486e-f496-4330-87d5-dbbe5a60b7ce-metrics-server-audit-profiles\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:34.997231 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.997164 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38cf486e-f496-4330-87d5-dbbe5a60b7ce-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:34.997231 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.997223 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/38cf486e-f496-4330-87d5-dbbe5a60b7ce-audit-log\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:34.997389 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.997322 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38cf486e-f496-4330-87d5-dbbe5a60b7ce-client-ca-bundle\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:34.997389 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.997349 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/38cf486e-f496-4330-87d5-dbbe5a60b7ce-secret-metrics-server-client-certs\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:34.997486 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.997396 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/38cf486e-f496-4330-87d5-dbbe5a60b7ce-secret-metrics-server-tls\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:34.997486 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:34.997417 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqvkn\" (UniqueName: \"kubernetes.io/projected/38cf486e-f496-4330-87d5-dbbe5a60b7ce-kube-api-access-hqvkn\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.098966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.098527 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/38cf486e-f496-4330-87d5-dbbe5a60b7ce-secret-metrics-server-tls\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.098966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.098575 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqvkn\" (UniqueName: \"kubernetes.io/projected/38cf486e-f496-4330-87d5-dbbe5a60b7ce-kube-api-access-hqvkn\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.098966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.098655 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/38cf486e-f496-4330-87d5-dbbe5a60b7ce-metrics-server-audit-profiles\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.098966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.098698 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38cf486e-f496-4330-87d5-dbbe5a60b7ce-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.098966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.098731 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/38cf486e-f496-4330-87d5-dbbe5a60b7ce-audit-log\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.098966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.098785 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38cf486e-f496-4330-87d5-dbbe5a60b7ce-client-ca-bundle\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.098966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.098811 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/38cf486e-f496-4330-87d5-dbbe5a60b7ce-secret-metrics-server-client-certs\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.099409 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.099382 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/38cf486e-f496-4330-87d5-dbbe5a60b7ce-audit-log\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.099773 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.099750 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38cf486e-f496-4330-87d5-dbbe5a60b7ce-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.100455 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.100409 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/38cf486e-f496-4330-87d5-dbbe5a60b7ce-metrics-server-audit-profiles\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.102129 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.102096 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38cf486e-f496-4330-87d5-dbbe5a60b7ce-client-ca-bundle\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.102193 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.102162 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/38cf486e-f496-4330-87d5-dbbe5a60b7ce-secret-metrics-server-client-certs\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.102865 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.102811 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/38cf486e-f496-4330-87d5-dbbe5a60b7ce-secret-metrics-server-tls\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.108443 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.108399 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqvkn\" (UniqueName: \"kubernetes.io/projected/38cf486e-f496-4330-87d5-dbbe5a60b7ce-kube-api-access-hqvkn\") pod \"metrics-server-585c75f8cc-9sn69\" (UID: \"38cf486e-f496-4330-87d5-dbbe5a60b7ce\") " pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.216987 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.216839 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69"
Apr 24 14:26:35.364148 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.364090 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-585c75f8cc-9sn69"]
Apr 24 14:26:35.368332 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:26:35.368305 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38cf486e_f496_4330_87d5_dbbe5a60b7ce.slice/crio-536f183ae8d066229c19af093266646c0ee47c464ef4c3effbb9b617f0c4db3e WatchSource:0}: Error finding container 536f183ae8d066229c19af093266646c0ee47c464ef4c3effbb9b617f0c4db3e: Status 404 returned error can't find the container with id 536f183ae8d066229c19af093266646c0ee47c464ef4c3effbb9b617f0c4db3e
Apr 24 14:26:35.775710 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.775231 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-b9bf7858d-dwf2g"]
Apr 24 14:26:35.778536 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.778519 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g"
Apr 24 14:26:35.781251 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.781177 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 24 14:26:35.781251 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.781241 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-d2nj7\""
Apr 24 14:26:35.781442 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.781246 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 24 14:26:35.781442 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.781244 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 24 14:26:35.781442 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.781247 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 24 14:26:35.781676 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.781659 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 24 14:26:35.786585 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.786563 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 24 14:26:35.796209 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.796186 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-b9bf7858d-dwf2g"]
Apr 24 14:26:35.806100 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.806078 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc1a78af-2e01-47bb-b73e-143d3e005ea0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g"
Apr 24 14:26:35.806186 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.806111 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bc1a78af-2e01-47bb-b73e-143d3e005ea0-secret-telemeter-client\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g"
Apr 24 14:26:35.806221 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.806178 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc1a78af-2e01-47bb-b73e-143d3e005ea0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g"
Apr 24 14:26:35.806263 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.806240 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc1a78af-2e01-47bb-b73e-143d3e005ea0-metrics-client-ca\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g"
Apr 24 14:26:35.806299 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.806282 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bc1a78af-2e01-47bb-b73e-143d3e005ea0-telemeter-client-tls\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g"
Apr 24 14:26:35.806331 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.806304 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bc1a78af-2e01-47bb-b73e-143d3e005ea0-federate-client-tls\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g"
Apr 24 14:26:35.806331 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.806323 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w4cc\" (UniqueName: \"kubernetes.io/projected/bc1a78af-2e01-47bb-b73e-143d3e005ea0-kube-api-access-6w4cc\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g"
Apr 24 14:26:35.806393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.806350 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc1a78af-2e01-47bb-b73e-143d3e005ea0-serving-certs-ca-bundle\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g"
Apr 24 14:26:35.870248 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.870213 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b99977797-jpktx" event={"ID":"4b0b822c-db44-478f-a5c4-349987078d8c","Type":"ContainerStarted","Data":"ba7692c2a4782d390b3c0ff2ee9d4ed06f99c3021083d7ba338519135734779a"}
Apr 24 14:26:35.870698 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.870258 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b99977797-jpktx" event={"ID":"4b0b822c-db44-478f-a5c4-349987078d8c","Type":"ContainerStarted","Data":"177544b2ce9c830a06635c8587186b430b4d9c9b3c3b50368a9b5b65409e18ec"}
Apr 24 14:26:35.870698 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.870274 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b99977797-jpktx" event={"ID":"4b0b822c-db44-478f-a5c4-349987078d8c","Type":"ContainerStarted","Data":"39efb3c2153b0601a36bedd037361b7f7fc6cbfd4b07038e255b6d33a963b03f"}
Apr 24 14:26:35.871374 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.871335 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69" event={"ID":"38cf486e-f496-4330-87d5-dbbe5a60b7ce","Type":"ContainerStarted","Data":"536f183ae8d066229c19af093266646c0ee47c464ef4c3effbb9b617f0c4db3e"}
Apr 24 14:26:35.907543 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.907518 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc1a78af-2e01-47bb-b73e-143d3e005ea0-serving-certs-ca-bundle\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g"
Apr 24 14:26:35.907670 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.907562 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc1a78af-2e01-47bb-b73e-143d3e005ea0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g"
Apr 24 14:26:35.907670 ip-10-0-129-77 kubenswrapper[2570]: I0424
14:26:35.907593 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bc1a78af-2e01-47bb-b73e-143d3e005ea0-secret-telemeter-client\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" Apr 24 14:26:35.907782 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.907667 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc1a78af-2e01-47bb-b73e-143d3e005ea0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" Apr 24 14:26:35.907782 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.907724 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc1a78af-2e01-47bb-b73e-143d3e005ea0-metrics-client-ca\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" Apr 24 14:26:35.907782 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.907767 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bc1a78af-2e01-47bb-b73e-143d3e005ea0-telemeter-client-tls\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" Apr 24 14:26:35.907927 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.907793 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bc1a78af-2e01-47bb-b73e-143d3e005ea0-federate-client-tls\") pod 
\"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" Apr 24 14:26:35.907927 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.907823 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6w4cc\" (UniqueName: \"kubernetes.io/projected/bc1a78af-2e01-47bb-b73e-143d3e005ea0-kube-api-access-6w4cc\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" Apr 24 14:26:35.908571 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.908547 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc1a78af-2e01-47bb-b73e-143d3e005ea0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" Apr 24 14:26:35.908755 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.908731 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc1a78af-2e01-47bb-b73e-143d3e005ea0-metrics-client-ca\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" Apr 24 14:26:35.908903 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.908882 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc1a78af-2e01-47bb-b73e-143d3e005ea0-serving-certs-ca-bundle\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" Apr 24 14:26:35.910247 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.910224 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bc1a78af-2e01-47bb-b73e-143d3e005ea0-telemeter-client-tls\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" Apr 24 14:26:35.910325 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.910243 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bc1a78af-2e01-47bb-b73e-143d3e005ea0-secret-telemeter-client\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" Apr 24 14:26:35.910325 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.910288 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc1a78af-2e01-47bb-b73e-143d3e005ea0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" Apr 24 14:26:35.911395 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.911361 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bc1a78af-2e01-47bb-b73e-143d3e005ea0-federate-client-tls\") pod \"telemeter-client-b9bf7858d-dwf2g\" (UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" Apr 24 14:26:35.917015 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:35.916987 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w4cc\" (UniqueName: \"kubernetes.io/projected/bc1a78af-2e01-47bb-b73e-143d3e005ea0-kube-api-access-6w4cc\") pod \"telemeter-client-b9bf7858d-dwf2g\" 
(UID: \"bc1a78af-2e01-47bb-b73e-143d3e005ea0\") " pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" Apr 24 14:26:36.088874 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.088833 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" Apr 24 14:26:36.239664 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.239615 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-b9bf7858d-dwf2g"] Apr 24 14:26:36.290318 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:26:36.290254 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc1a78af_2e01_47bb_b73e_143d3e005ea0.slice/crio-449f84488298865b2b564b6f3f7f9ff9438d209c7c3c871e1d03388c6a93202f WatchSource:0}: Error finding container 449f84488298865b2b564b6f3f7f9ff9438d209c7c3c871e1d03388c6a93202f: Status 404 returned error can't find the container with id 449f84488298865b2b564b6f3f7f9ff9438d209c7c3c871e1d03388c6a93202f Apr 24 14:26:36.726524 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.726488 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:26:36.730388 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.730364 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.733135 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.733115 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 14:26:36.733828 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.733393 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-dzlmh\"" Apr 24 14:26:36.733828 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.733463 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 14:26:36.733828 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.733525 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 14:26:36.733828 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.733534 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 14:26:36.733828 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.733554 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 14:26:36.733828 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.733648 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 14:26:36.733828 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.733719 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 14:26:36.733828 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.733722 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-9osrbbpqd7ude\"" Apr 24 14:26:36.733828 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.733742 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 14:26:36.733828 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.733829 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 14:26:36.735888 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.735869 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 14:26:36.736030 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.735870 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 14:26:36.736742 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.736725 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 14:26:36.744335 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.744315 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:26:36.815563 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.815535 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-config\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.815750 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.815569 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.815750 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.815665 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.815750 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.815692 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.815750 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.815728 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.815750 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.815749 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
14:26:36.816071 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.815765 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.816071 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.815798 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbpk6\" (UniqueName: \"kubernetes.io/projected/98168d07-827e-4cd1-8fda-2885b13bff36-kube-api-access-fbpk6\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.816071 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.815832 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-web-config\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.816071 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.815849 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/98168d07-827e-4cd1-8fda-2885b13bff36-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.816071 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.815872 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-serving-certs-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.816071 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.815898 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.816071 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.815921 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.816071 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.815985 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.816071 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.816027 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/98168d07-827e-4cd1-8fda-2885b13bff36-config-out\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.816071 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.816057 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.816857 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.816091 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.816857 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.816268 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.877292 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.877262 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b99977797-jpktx" event={"ID":"4b0b822c-db44-478f-a5c4-349987078d8c","Type":"ContainerStarted","Data":"9bc9951de905bafc43392ed9826e3ca10274860a906b3db8c1526e6ad06e6eca"} Apr 24 14:26:36.877606 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.877302 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b99977797-jpktx" event={"ID":"4b0b822c-db44-478f-a5c4-349987078d8c","Type":"ContainerStarted","Data":"071a96ea85ebed9374dfbceec9c4082c493397d12d3ef1f1f11529ba2ce0dfce"} Apr 24 14:26:36.878865 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.878840 2570 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-dns/dns-default-hs6xt" event={"ID":"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e","Type":"ContainerStarted","Data":"b3340d9de9cf7b09c4b1c8c6f0bf33aef569c5d78bc573285151718fcee8d972"} Apr 24 14:26:36.878865 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.878872 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hs6xt" event={"ID":"2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e","Type":"ContainerStarted","Data":"94e9a2701c82022b176ed7ba049bdb016a5658f21e727e4763e2836a1b702952"} Apr 24 14:26:36.879034 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.878990 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hs6xt" Apr 24 14:26:36.879934 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.879915 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" event={"ID":"bc1a78af-2e01-47bb-b73e-143d3e005ea0","Type":"ContainerStarted","Data":"449f84488298865b2b564b6f3f7f9ff9438d209c7c3c871e1d03388c6a93202f"} Apr 24 14:26:36.896989 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.896936 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hs6xt" podStartSLOduration=130.179849426 podStartE2EDuration="2m11.896918001s" podCreationTimestamp="2026-04-24 14:24:25 +0000 UTC" firstStartedPulling="2026-04-24 14:26:34.091487908 +0000 UTC m=+162.217363601" lastFinishedPulling="2026-04-24 14:26:35.808556493 +0000 UTC m=+163.934432176" observedRunningTime="2026-04-24 14:26:36.894954615 +0000 UTC m=+165.020830320" watchObservedRunningTime="2026-04-24 14:26:36.896918001 +0000 UTC m=+165.022793709" Apr 24 14:26:36.917446 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.917417 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbpk6\" (UniqueName: \"kubernetes.io/projected/98168d07-827e-4cd1-8fda-2885b13bff36-kube-api-access-fbpk6\") 
pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.917572 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.917455 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-web-config\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.917572 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.917474 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/98168d07-827e-4cd1-8fda-2885b13bff36-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.917572 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.917491 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.917572 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.917513 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.917803 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.917771 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.917874 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.917822 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.917874 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.917857 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/98168d07-827e-4cd1-8fda-2885b13bff36-config-out\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.917965 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.917884 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.917965 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.917918 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.918064 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.917970 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.918116 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.918065 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-config\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.918116 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.918092 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.918209 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.918154 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.918209 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.918198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
14:26:36.918304 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.918256 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.918304 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.918287 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.918389 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.918313 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.918523 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.918497 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.918729 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.918711 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.920658 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.920537 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/98168d07-827e-4cd1-8fda-2885b13bff36-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.920798 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.920779 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-web-config\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.920895 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.920874 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.920993 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.920971 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-config\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.921057 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.921005 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.921309 ip-10-0-129-77 
kubenswrapper[2570]: I0424 14:26:36.921283 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.921466 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.921422 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.921533 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.921475 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.921764 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.921722 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.922387 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.922336 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.923547 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.923525 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.923715 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.923693 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.924209 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.923969 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/98168d07-827e-4cd1-8fda-2885b13bff36-config-out\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.924209 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.924121 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.924209 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.924133 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:36.925752 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:36.925735 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbpk6\" (UniqueName: \"kubernetes.io/projected/98168d07-827e-4cd1-8fda-2885b13bff36-kube-api-access-fbpk6\") pod \"prometheus-k8s-0\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:37.042180 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:37.042090 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:37.198734 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:37.198699 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:26:37.203923 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:26:37.203895 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98168d07_827e_4cd1_8fda_2885b13bff36.slice/crio-ede70a465886049635bb8f628fc0a6fde8dc964369e46f39e18300b0d2f1660a WatchSource:0}: Error finding container ede70a465886049635bb8f628fc0a6fde8dc964369e46f39e18300b0d2f1660a: Status 404 returned error can't find the container with id ede70a465886049635bb8f628fc0a6fde8dc964369e46f39e18300b0d2f1660a Apr 24 14:26:37.563682 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:37.563644 2570 patch_prober.go:28] interesting pod/image-registry-7cd9495d59-q64t7 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 
14:26:37.563848 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:37.563706 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7cd9495d59-q64t7" podUID="2a9e2553-3905-4307-bccc-cf5c6355779f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:26:37.885435 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:37.885400 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b99977797-jpktx" event={"ID":"4b0b822c-db44-478f-a5c4-349987078d8c","Type":"ContainerStarted","Data":"20fe8ce6e7f3f47b6d62f653cae0905d981fb94996fd917e6962fbc9e5cff399"} Apr 24 14:26:37.885915 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:37.885641 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-b99977797-jpktx" Apr 24 14:26:37.887073 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:37.887046 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69" event={"ID":"38cf486e-f496-4330-87d5-dbbe5a60b7ce","Type":"ContainerStarted","Data":"0fd2387de9fbb47515a56cfee0c7c00b9a569db7a1108970df7949184401f38c"} Apr 24 14:26:37.888169 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:37.888142 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"98168d07-827e-4cd1-8fda-2885b13bff36","Type":"ContainerStarted","Data":"ede70a465886049635bb8f628fc0a6fde8dc964369e46f39e18300b0d2f1660a"} Apr 24 14:26:37.906841 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:37.906796 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-b99977797-jpktx" podStartSLOduration=2.5365509299999998 podStartE2EDuration="5.906781787s" podCreationTimestamp="2026-04-24 14:26:32 +0000 UTC" firstStartedPulling="2026-04-24 14:26:32.966932938 +0000 UTC m=+161.092808622" 
lastFinishedPulling="2026-04-24 14:26:36.337163793 +0000 UTC m=+164.463039479" observedRunningTime="2026-04-24 14:26:37.905607875 +0000 UTC m=+166.031483596" watchObservedRunningTime="2026-04-24 14:26:37.906781787 +0000 UTC m=+166.032657493" Apr 24 14:26:37.921219 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:37.921166 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69" podStartSLOduration=2.425815218 podStartE2EDuration="3.921153683s" podCreationTimestamp="2026-04-24 14:26:34 +0000 UTC" firstStartedPulling="2026-04-24 14:26:35.371004165 +0000 UTC m=+163.496879853" lastFinishedPulling="2026-04-24 14:26:36.866342622 +0000 UTC m=+164.992218318" observedRunningTime="2026-04-24 14:26:37.920796664 +0000 UTC m=+166.046672429" watchObservedRunningTime="2026-04-24 14:26:37.921153683 +0000 UTC m=+166.047029466" Apr 24 14:26:38.815788 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:38.815713 2570 patch_prober.go:28] interesting pod/image-registry-7cd9495d59-q64t7 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 14:26:38.815788 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:38.815766 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7cd9495d59-q64t7" podUID="2a9e2553-3905-4307-bccc-cf5c6355779f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:26:38.893007 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:38.892931 2570 generic.go:358] "Generic (PLEG): container finished" podID="98168d07-827e-4cd1-8fda-2885b13bff36" containerID="ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594" exitCode=0 Apr 24 14:26:38.893455 ip-10-0-129-77 kubenswrapper[2570]: 
I0424 14:26:38.893017 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"98168d07-827e-4cd1-8fda-2885b13bff36","Type":"ContainerDied","Data":"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594"} Apr 24 14:26:38.895105 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:38.895078 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" event={"ID":"bc1a78af-2e01-47bb-b73e-143d3e005ea0","Type":"ContainerStarted","Data":"6ce1ad7fce144e1871ef525ef328b8722aa491d65719bb40956b87eef09a5f99"} Apr 24 14:26:38.895238 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:38.895113 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" event={"ID":"bc1a78af-2e01-47bb-b73e-143d3e005ea0","Type":"ContainerStarted","Data":"26204b1707744ae132b7c3a5ce62c5f15485fc95d8b55f195636d9a26afc4f33"} Apr 24 14:26:38.895238 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:38.895124 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" event={"ID":"bc1a78af-2e01-47bb-b73e-143d3e005ea0","Type":"ContainerStarted","Data":"1fa40a8795c79ea30e55f454819feeefd433ce8e39a3efd4e99f2f3bb5879c39"} Apr 24 14:26:38.943139 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:38.942983 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-b9bf7858d-dwf2g" podStartSLOduration=1.789123164 podStartE2EDuration="3.94296506s" podCreationTimestamp="2026-04-24 14:26:35 +0000 UTC" firstStartedPulling="2026-04-24 14:26:36.29272747 +0000 UTC m=+164.418603167" lastFinishedPulling="2026-04-24 14:26:38.44656937 +0000 UTC m=+166.572445063" observedRunningTime="2026-04-24 14:26:38.936675725 +0000 UTC m=+167.062551431" watchObservedRunningTime="2026-04-24 14:26:38.94296506 +0000 UTC m=+167.068840766" Apr 24 14:26:39.413755 
ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:39.413710 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lkncg" Apr 24 14:26:39.416890 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:39.416873 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-84zkd\"" Apr 24 14:26:39.425153 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:39.425139 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lkncg" Apr 24 14:26:39.541849 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:39.541680 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lkncg"] Apr 24 14:26:39.544268 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:26:39.544226 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ab7e98f_94b0_4d9b_9d46_6666f05fb86a.slice/crio-e3da969db5efef070810ae65b9f30c6eda2b8dc85fc3e8b358c4f55dcdaebfca WatchSource:0}: Error finding container e3da969db5efef070810ae65b9f30c6eda2b8dc85fc3e8b358c4f55dcdaebfca: Status 404 returned error can't find the container with id e3da969db5efef070810ae65b9f30c6eda2b8dc85fc3e8b358c4f55dcdaebfca Apr 24 14:26:39.899936 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:39.899898 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lkncg" event={"ID":"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a","Type":"ContainerStarted","Data":"e3da969db5efef070810ae65b9f30c6eda2b8dc85fc3e8b358c4f55dcdaebfca"} Apr 24 14:26:42.913262 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:42.913222 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"98168d07-827e-4cd1-8fda-2885b13bff36","Type":"ContainerStarted","Data":"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d"} Apr 24 14:26:42.913692 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:42.913270 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"98168d07-827e-4cd1-8fda-2885b13bff36","Type":"ContainerStarted","Data":"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd"} Apr 24 14:26:42.913692 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:42.913285 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"98168d07-827e-4cd1-8fda-2885b13bff36","Type":"ContainerStarted","Data":"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10"} Apr 24 14:26:42.913692 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:42.913295 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"98168d07-827e-4cd1-8fda-2885b13bff36","Type":"ContainerStarted","Data":"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef"} Apr 24 14:26:42.913692 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:42.913303 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"98168d07-827e-4cd1-8fda-2885b13bff36","Type":"ContainerStarted","Data":"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23"} Apr 24 14:26:42.913692 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:42.913317 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"98168d07-827e-4cd1-8fda-2885b13bff36","Type":"ContainerStarted","Data":"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c"} Apr 24 14:26:42.914539 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:42.914516 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lkncg" 
event={"ID":"7ab7e98f-94b0-4d9b-9d46-6666f05fb86a","Type":"ContainerStarted","Data":"54341e014d8eff6c06d4f3c4e14bb81030cee52784b363009eb6d6712b083c15"} Apr 24 14:26:42.942217 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:42.942165 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.013440663 podStartE2EDuration="6.942151545s" podCreationTimestamp="2026-04-24 14:26:36 +0000 UTC" firstStartedPulling="2026-04-24 14:26:37.206120358 +0000 UTC m=+165.331996040" lastFinishedPulling="2026-04-24 14:26:42.134831239 +0000 UTC m=+170.260706922" observedRunningTime="2026-04-24 14:26:42.939928969 +0000 UTC m=+171.065804675" watchObservedRunningTime="2026-04-24 14:26:42.942151545 +0000 UTC m=+171.068027250" Apr 24 14:26:42.954936 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:42.954888 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lkncg" podStartSLOduration=135.370425765 podStartE2EDuration="2m17.954875107s" podCreationTimestamp="2026-04-24 14:24:25 +0000 UTC" firstStartedPulling="2026-04-24 14:26:39.54606001 +0000 UTC m=+167.671935692" lastFinishedPulling="2026-04-24 14:26:42.130509337 +0000 UTC m=+170.256385034" observedRunningTime="2026-04-24 14:26:42.954430129 +0000 UTC m=+171.080305834" watchObservedRunningTime="2026-04-24 14:26:42.954875107 +0000 UTC m=+171.080750812" Apr 24 14:26:43.413689 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:43.413651 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:26:43.901108 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:43.901079 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-b99977797-jpktx" Apr 24 14:26:46.889998 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:46.889963 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hs6xt" Apr 24 14:26:47.042464 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:47.042430 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:26:47.562998 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:47.562957 2570 patch_prober.go:28] interesting pod/image-registry-7cd9495d59-q64t7 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 14:26:47.563138 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:47.563009 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7cd9495d59-q64t7" podUID="2a9e2553-3905-4307-bccc-cf5c6355779f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:26:48.816590 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:48.816564 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7cd9495d59-q64t7" Apr 24 14:26:55.216947 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:55.216911 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69" Apr 24 14:26:55.217389 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:55.216957 2570 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69" Apr 24 14:26:57.969239 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:57.969204 2570 generic.go:358] "Generic (PLEG): container finished" podID="01600e7e-779b-41ca-b62a-79288fc11666" containerID="f21aedf43ed61de8a2dd89bd2690dff04ba0d2f2cb1d666f131e23be22c84bdc" exitCode=0 Apr 24 14:26:57.969694 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:57.969258 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" event={"ID":"01600e7e-779b-41ca-b62a-79288fc11666","Type":"ContainerDied","Data":"f21aedf43ed61de8a2dd89bd2690dff04ba0d2f2cb1d666f131e23be22c84bdc"} Apr 24 14:26:57.969694 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:57.969532 2570 scope.go:117] "RemoveContainer" containerID="f21aedf43ed61de8a2dd89bd2690dff04ba0d2f2cb1d666f131e23be22c84bdc" Apr 24 14:26:58.974088 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:26:58.974049 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtc7l" event={"ID":"01600e7e-779b-41ca-b62a-79288fc11666","Type":"ContainerStarted","Data":"749738dd8fc05ad676429457c8a17013de450b6dffd87a7a2ea9cae3114ab87e"} Apr 24 14:27:08.001011 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:08.000974 2570 generic.go:358] "Generic (PLEG): container finished" podID="562c06c8-1f5f-456c-81f3-a499e9769f12" containerID="510d429e278f505e5ed7bf267e8fed911c3508e77305d9df3339e37ce89c6b3a" exitCode=0 Apr 24 14:27:08.001407 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:08.001049 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hp8qf" event={"ID":"562c06c8-1f5f-456c-81f3-a499e9769f12","Type":"ContainerDied","Data":"510d429e278f505e5ed7bf267e8fed911c3508e77305d9df3339e37ce89c6b3a"} Apr 
24 14:27:08.001407 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:08.001388 2570 scope.go:117] "RemoveContainer" containerID="510d429e278f505e5ed7bf267e8fed911c3508e77305d9df3339e37ce89c6b3a" Apr 24 14:27:09.004812 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:09.004780 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hp8qf" event={"ID":"562c06c8-1f5f-456c-81f3-a499e9769f12","Type":"ContainerStarted","Data":"445e93253e07a7fa105515fefaa6568eb64dd7b6bfb222b165415a0889285ac1"} Apr 24 14:27:11.175031 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:11.175000 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-p284p_788ee4fa-ade7-40eb-ba34-7fa69f106caf/cluster-monitoring-operator/0.log" Apr 24 14:27:11.972898 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:11.972855 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-585c75f8cc-9sn69_38cf486e-f496-4330-87d5-dbbe5a60b7ce/metrics-server/0.log" Apr 24 14:27:12.972962 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:12.972935 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s6ldx_f8125906-514c-4f62-9b0c-82f218af43f1/init-textfile/0.log" Apr 24 14:27:13.173485 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:13.173458 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s6ldx_f8125906-514c-4f62-9b0c-82f218af43f1/node-exporter/0.log" Apr 24 14:27:13.373878 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:13.373831 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s6ldx_f8125906-514c-4f62-9b0c-82f218af43f1/kube-rbac-proxy/0.log" Apr 24 14:27:14.773781 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:14.773715 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_98168d07-827e-4cd1-8fda-2885b13bff36/init-config-reloader/0.log" Apr 24 14:27:14.974511 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:14.974483 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_98168d07-827e-4cd1-8fda-2885b13bff36/prometheus/0.log" Apr 24 14:27:15.173780 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:15.173738 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_98168d07-827e-4cd1-8fda-2885b13bff36/config-reloader/0.log" Apr 24 14:27:15.223069 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:15.223039 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69" Apr 24 14:27:15.226826 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:15.226799 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-585c75f8cc-9sn69" Apr 24 14:27:15.373498 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:15.373469 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_98168d07-827e-4cd1-8fda-2885b13bff36/thanos-sidecar/0.log" Apr 24 14:27:15.573535 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:15.573424 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_98168d07-827e-4cd1-8fda-2885b13bff36/kube-rbac-proxy-web/0.log" Apr 24 14:27:15.773107 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:15.773079 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_98168d07-827e-4cd1-8fda-2885b13bff36/kube-rbac-proxy/0.log" Apr 24 14:27:15.973029 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:15.973003 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_98168d07-827e-4cd1-8fda-2885b13bff36/kube-rbac-proxy-thanos/0.log" Apr 24 14:27:16.773210 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:16.773177 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b9bf7858d-dwf2g_bc1a78af-2e01-47bb-b73e-143d3e005ea0/telemeter-client/0.log" Apr 24 14:27:16.973660 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:16.973603 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b9bf7858d-dwf2g_bc1a78af-2e01-47bb-b73e-143d3e005ea0/reload/0.log" Apr 24 14:27:17.173145 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:17.173111 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b9bf7858d-dwf2g_bc1a78af-2e01-47bb-b73e-143d3e005ea0/kube-rbac-proxy/0.log" Apr 24 14:27:17.373760 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:17.373713 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b99977797-jpktx_4b0b822c-db44-478f-a5c4-349987078d8c/thanos-query/0.log" Apr 24 14:27:17.573774 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:17.573698 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b99977797-jpktx_4b0b822c-db44-478f-a5c4-349987078d8c/kube-rbac-proxy-web/0.log" Apr 24 14:27:17.773499 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:17.773467 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b99977797-jpktx_4b0b822c-db44-478f-a5c4-349987078d8c/kube-rbac-proxy/0.log" Apr 24 14:27:17.973497 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:17.973471 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b99977797-jpktx_4b0b822c-db44-478f-a5c4-349987078d8c/prom-label-proxy/0.log" Apr 24 14:27:18.173652 ip-10-0-129-77 kubenswrapper[2570]: I0424 
14:27:18.173606 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b99977797-jpktx_4b0b822c-db44-478f-a5c4-349987078d8c/kube-rbac-proxy-rules/0.log"
Apr 24 14:27:18.373501 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:18.373472 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b99977797-jpktx_4b0b822c-db44-478f-a5c4-349987078d8c/kube-rbac-proxy-metrics/0.log"
Apr 24 14:27:20.173903 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:20.173876 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-lkncg_7ab7e98f-94b0-4d9b-9d46-6666f05fb86a/serve-healthcheck-canary/0.log"
Apr 24 14:27:37.042415 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:37.042379 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:37.061197 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:37.061162 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:37.103928 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:37.103902 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:55.083861 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.083827 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 14:27:55.084367 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.084296 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="prometheus" containerID="cri-o://c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c" gracePeriod=600
Apr 24 14:27:55.084367 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.084296 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="thanos-sidecar" containerID="cri-o://9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef" gracePeriod=600
Apr 24 14:27:55.084367 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.084335 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="config-reloader" containerID="cri-o://9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23" gracePeriod=600
Apr 24 14:27:55.084532 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.084359 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="kube-rbac-proxy-web" containerID="cri-o://e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10" gracePeriod=600
Apr 24 14:27:55.084532 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.084331 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="kube-rbac-proxy-thanos" containerID="cri-o://4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d" gracePeriod=600
Apr 24 14:27:55.084532 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.084299 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="kube-rbac-proxy" containerID="cri-o://63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd" gracePeriod=600
Apr 24 14:27:55.325023 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.324998 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:55.434227 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434179 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-k8s-rulefiles-0\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434227 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434236 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-thanos-prometheus-http-client-file\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434501 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434257 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-web-config\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434501 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434279 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-metrics-client-ca\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434501 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434295 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbpk6\" (UniqueName: \"kubernetes.io/projected/98168d07-827e-4cd1-8fda-2885b13bff36-kube-api-access-fbpk6\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434501 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434328 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-k8s-db\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434501 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434361 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434501 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434387 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/98168d07-827e-4cd1-8fda-2885b13bff36-config-out\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434501 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434416 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-trusted-ca-bundle\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434501 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434458 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-kubelet-serving-ca-bundle\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434923 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434506 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-kube-rbac-proxy\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434923 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434532 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-grpc-tls\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434923 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434560 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/98168d07-827e-4cd1-8fda-2885b13bff36-tls-assets\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434923 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434586 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-config\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434923 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434614 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-metrics-client-certs\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434923 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434674 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-serving-certs-ca-bundle\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434923 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434723 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-tls\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434923 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434749 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"98168d07-827e-4cd1-8fda-2885b13bff36\" (UID: \"98168d07-827e-4cd1-8fda-2885b13bff36\") "
Apr 24 14:27:55.434923 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434853 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:27:55.434923 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.434945 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:27:55.435649 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.435264 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-metrics-client-ca\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.435649 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.435286 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.435885 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.435832 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:27:55.436100 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.436078 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:27:55.438220 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.438188 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98168d07-827e-4cd1-8fda-2885b13bff36-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:27:55.438321 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.438275 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98168d07-827e-4cd1-8fda-2885b13bff36-kube-api-access-fbpk6" (OuterVolumeSpecName: "kube-api-access-fbpk6") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "kube-api-access-fbpk6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 14:27:55.438393 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.438357 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:55.438810 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.438781 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:27:55.438810 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.438800 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 14:27:55.438962 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.438873 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-config" (OuterVolumeSpecName: "config") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:55.438962 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.438897 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:55.438962 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.438932 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98168d07-827e-4cd1-8fda-2885b13bff36-config-out" (OuterVolumeSpecName: "config-out") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 14:27:55.439573 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.439537 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:55.439702 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.439592 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:55.440008 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.439987 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:55.440676 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.440655 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:55.440808 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.440786 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:55.449459 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.449431 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-web-config" (OuterVolumeSpecName: "web-config") pod "98168d07-827e-4cd1-8fda-2885b13bff36" (UID: "98168d07-827e-4cd1-8fda-2885b13bff36"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 14:27:55.536321 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536282 2570 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-kube-rbac-proxy\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.536321 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536315 2570 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-grpc-tls\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.536321 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536325 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/98168d07-827e-4cd1-8fda-2885b13bff36-tls-assets\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.536555 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536335 2570 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-config\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.536555 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536344 2570 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-metrics-client-certs\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.536555 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536353 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.536555 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536363 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.536555 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536372 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.536555 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536381 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.536555 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536390 2570 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.536555 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536399 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-web-config\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.536555 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536408 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fbpk6\" (UniqueName: \"kubernetes.io/projected/98168d07-827e-4cd1-8fda-2885b13bff36-kube-api-access-fbpk6\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.536555 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536416 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/98168d07-827e-4cd1-8fda-2885b13bff36-prometheus-k8s-db\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.536555 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536424 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/98168d07-827e-4cd1-8fda-2885b13bff36-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.536555 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536442 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/98168d07-827e-4cd1-8fda-2885b13bff36-config-out\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:55.536555 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:55.536452 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98168d07-827e-4cd1-8fda-2885b13bff36-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\""
Apr 24 14:27:56.148764 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.148732 2570 generic.go:358] "Generic (PLEG): container finished" podID="98168d07-827e-4cd1-8fda-2885b13bff36" containerID="4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d" exitCode=0
Apr 24 14:27:56.148764 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.148758 2570 generic.go:358] "Generic (PLEG): container finished" podID="98168d07-827e-4cd1-8fda-2885b13bff36" containerID="63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd" exitCode=0
Apr 24 14:27:56.148764 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.148764 2570 generic.go:358] "Generic (PLEG): container finished" podID="98168d07-827e-4cd1-8fda-2885b13bff36" containerID="e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10" exitCode=0
Apr 24 14:27:56.148764 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.148769 2570 generic.go:358] "Generic (PLEG): container finished" podID="98168d07-827e-4cd1-8fda-2885b13bff36" containerID="9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef" exitCode=0
Apr 24 14:27:56.148764 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.148774 2570 generic.go:358] "Generic (PLEG): container finished" podID="98168d07-827e-4cd1-8fda-2885b13bff36" containerID="9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23" exitCode=0
Apr 24 14:27:56.149313 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.148779 2570 generic.go:358] "Generic (PLEG): container finished" podID="98168d07-827e-4cd1-8fda-2885b13bff36" containerID="c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c" exitCode=0
Apr 24 14:27:56.149313 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.148819 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"98168d07-827e-4cd1-8fda-2885b13bff36","Type":"ContainerDied","Data":"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d"}
Apr 24 14:27:56.149313 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.148843 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 14:27:56.149313 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.148861 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"98168d07-827e-4cd1-8fda-2885b13bff36","Type":"ContainerDied","Data":"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd"}
Apr 24 14:27:56.149313 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.148877 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"98168d07-827e-4cd1-8fda-2885b13bff36","Type":"ContainerDied","Data":"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10"}
Apr 24 14:27:56.149313 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.148890 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"98168d07-827e-4cd1-8fda-2885b13bff36","Type":"ContainerDied","Data":"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef"}
Apr 24 14:27:56.149313 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.148903 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"98168d07-827e-4cd1-8fda-2885b13bff36","Type":"ContainerDied","Data":"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23"}
Apr 24 14:27:56.149313 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.148916 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"98168d07-827e-4cd1-8fda-2885b13bff36","Type":"ContainerDied","Data":"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c"}
Apr 24 14:27:56.149313 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.148923 2570 scope.go:117] "RemoveContainer" containerID="4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d"
Apr 24 14:27:56.149313 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.148929 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"98168d07-827e-4cd1-8fda-2885b13bff36","Type":"ContainerDied","Data":"ede70a465886049635bb8f628fc0a6fde8dc964369e46f39e18300b0d2f1660a"}
Apr 24 14:27:56.160203 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.160178 2570 scope.go:117] "RemoveContainer" containerID="63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd"
Apr 24 14:27:56.167819 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.167801 2570 scope.go:117] "RemoveContainer" containerID="e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10"
Apr 24 14:27:56.174431 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.174414 2570 scope.go:117] "RemoveContainer" containerID="9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef"
Apr 24 14:27:56.179524 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.179502 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 14:27:56.181150 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.181128 2570 scope.go:117] "RemoveContainer" containerID="9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23"
Apr 24 14:27:56.184672 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.184650 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 14:27:56.187855 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.187838 2570 scope.go:117] "RemoveContainer" containerID="c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c"
Apr 24 14:27:56.194475 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.194454 2570 scope.go:117] "RemoveContainer" containerID="ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594"
Apr 24 14:27:56.200595 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.200580 2570 scope.go:117] "RemoveContainer" containerID="4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d"
Apr 24 14:27:56.200873 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:27:56.200851 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d\": container with ID starting with 4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d not found: ID does not exist" containerID="4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d"
Apr 24 14:27:56.200930 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.200882 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d"} err="failed to get container status \"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d\": rpc error: code = NotFound desc = could not find container \"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d\": container with ID starting with 4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d not found: ID does not exist"
Apr 24 14:27:56.200930 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.200915 2570 scope.go:117] "RemoveContainer" containerID="63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd"
Apr 24 14:27:56.201134 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:27:56.201115 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd\": container with ID starting with 63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd not found: ID does not exist" containerID="63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd"
Apr 24 14:27:56.201201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.201143 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd"} err="failed to get container status \"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd\": rpc error: code = NotFound desc = could not find container \"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd\": container with ID starting with 63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd not found: ID does not exist"
Apr 24 14:27:56.201201 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.201167 2570 scope.go:117] "RemoveContainer" containerID="e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10"
Apr 24 14:27:56.201418 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:27:56.201402 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10\": container with ID starting with e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10 not found: ID does not exist" containerID="e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10"
Apr 24 14:27:56.201452 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.201423 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10"} err="failed to get container status \"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10\": rpc error: code = NotFound desc = could not find container \"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10\": container with ID starting with e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10 not found: ID does not exist"
Apr 24 14:27:56.201452 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.201437 2570 scope.go:117] "RemoveContainer" containerID="9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef"
Apr 24 14:27:56.201687 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:27:56.201666 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef\": container with ID starting with 9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef not found: ID does not exist" containerID="9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef"
Apr 24 14:27:56.201742 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.201693 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef"} err="failed to get container status \"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef\": rpc error: code = NotFound desc = could not find container \"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef\": container with ID starting with 9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef not found: ID does not exist"
Apr 24 14:27:56.201742 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.201713 2570 scope.go:117] "RemoveContainer" containerID="9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23"
Apr 24 14:27:56.201940 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:27:56.201925 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23\": container with ID starting with 9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23 not found: ID does not exist" containerID="9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23"
Apr 24 14:27:56.201981 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.201943 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23"} err="failed to get container status \"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23\": rpc error: code = NotFound desc = could not find container \"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23\": container with ID starting with 9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23 not found: ID does not exist"
Apr 24 14:27:56.201981 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.201956 2570 scope.go:117] "RemoveContainer" containerID="c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c"
Apr 24 14:27:56.202171 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:27:56.202154 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c\": container with ID starting with c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c not found: ID does not exist" containerID="c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c"
Apr 24 14:27:56.202231 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.202178 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c"} err="failed to get container status \"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c\": rpc error: code = NotFound desc = could not find container \"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c\": container with ID starting with c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c not found: ID does not exist"
Apr 24 14:27:56.202231 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.202198 2570 scope.go:117] "RemoveContainer" containerID="ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594"
Apr 24 14:27:56.202409 ip-10-0-129-77 kubenswrapper[2570]: E0424 14:27:56.202394 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code
= NotFound desc = could not find container \"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594\": container with ID starting with ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594 not found: ID does not exist" containerID="ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594" Apr 24 14:27:56.202470 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.202414 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594"} err="failed to get container status \"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594\": rpc error: code = NotFound desc = could not find container \"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594\": container with ID starting with ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594 not found: ID does not exist" Apr 24 14:27:56.202470 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.202434 2570 scope.go:117] "RemoveContainer" containerID="4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d" Apr 24 14:27:56.202644 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.202608 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d"} err="failed to get container status \"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d\": rpc error: code = NotFound desc = could not find container \"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d\": container with ID starting with 4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d not found: ID does not exist" Apr 24 14:27:56.202694 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.202645 2570 scope.go:117] "RemoveContainer" containerID="63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd" Apr 24 14:27:56.202871 ip-10-0-129-77 kubenswrapper[2570]: I0424 
14:27:56.202854 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd"} err="failed to get container status \"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd\": rpc error: code = NotFound desc = could not find container \"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd\": container with ID starting with 63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd not found: ID does not exist" Apr 24 14:27:56.202937 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.202871 2570 scope.go:117] "RemoveContainer" containerID="e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10" Apr 24 14:27:56.203080 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.203064 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10"} err="failed to get container status \"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10\": rpc error: code = NotFound desc = could not find container \"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10\": container with ID starting with e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10 not found: ID does not exist" Apr 24 14:27:56.203138 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.203081 2570 scope.go:117] "RemoveContainer" containerID="9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef" Apr 24 14:27:56.203291 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.203273 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef"} err="failed to get container status \"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef\": rpc error: code = NotFound desc = could not find container 
\"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef\": container with ID starting with 9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef not found: ID does not exist" Apr 24 14:27:56.203335 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.203291 2570 scope.go:117] "RemoveContainer" containerID="9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23" Apr 24 14:27:56.203504 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.203488 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23"} err="failed to get container status \"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23\": rpc error: code = NotFound desc = could not find container \"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23\": container with ID starting with 9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23 not found: ID does not exist" Apr 24 14:27:56.203568 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.203505 2570 scope.go:117] "RemoveContainer" containerID="c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c" Apr 24 14:27:56.203844 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.203826 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c"} err="failed to get container status \"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c\": rpc error: code = NotFound desc = could not find container \"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c\": container with ID starting with c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c not found: ID does not exist" Apr 24 14:27:56.203844 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.203844 2570 scope.go:117] "RemoveContainer" 
containerID="ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594" Apr 24 14:27:56.204071 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.204048 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594"} err="failed to get container status \"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594\": rpc error: code = NotFound desc = could not find container \"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594\": container with ID starting with ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594 not found: ID does not exist" Apr 24 14:27:56.204071 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.204071 2570 scope.go:117] "RemoveContainer" containerID="4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d" Apr 24 14:27:56.204318 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.204298 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d"} err="failed to get container status \"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d\": rpc error: code = NotFound desc = could not find container \"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d\": container with ID starting with 4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d not found: ID does not exist" Apr 24 14:27:56.204368 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.204318 2570 scope.go:117] "RemoveContainer" containerID="63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd" Apr 24 14:27:56.204509 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.204492 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd"} err="failed to get container status 
\"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd\": rpc error: code = NotFound desc = could not find container \"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd\": container with ID starting with 63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd not found: ID does not exist" Apr 24 14:27:56.204551 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.204509 2570 scope.go:117] "RemoveContainer" containerID="e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10" Apr 24 14:27:56.204717 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.204702 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10"} err="failed to get container status \"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10\": rpc error: code = NotFound desc = could not find container \"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10\": container with ID starting with e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10 not found: ID does not exist" Apr 24 14:27:56.204757 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.204718 2570 scope.go:117] "RemoveContainer" containerID="9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef" Apr 24 14:27:56.204897 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.204881 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef"} err="failed to get container status \"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef\": rpc error: code = NotFound desc = could not find container \"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef\": container with ID starting with 9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef not found: ID does not exist" Apr 24 14:27:56.204940 ip-10-0-129-77 
kubenswrapper[2570]: I0424 14:27:56.204896 2570 scope.go:117] "RemoveContainer" containerID="9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23" Apr 24 14:27:56.205065 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.205051 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23"} err="failed to get container status \"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23\": rpc error: code = NotFound desc = could not find container \"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23\": container with ID starting with 9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23 not found: ID does not exist" Apr 24 14:27:56.205112 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.205065 2570 scope.go:117] "RemoveContainer" containerID="c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c" Apr 24 14:27:56.205229 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.205212 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c"} err="failed to get container status \"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c\": rpc error: code = NotFound desc = could not find container \"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c\": container with ID starting with c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c not found: ID does not exist" Apr 24 14:27:56.205229 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.205227 2570 scope.go:117] "RemoveContainer" containerID="ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594" Apr 24 14:27:56.205437 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.205421 2570 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594"} err="failed to get container status \"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594\": rpc error: code = NotFound desc = could not find container \"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594\": container with ID starting with ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594 not found: ID does not exist" Apr 24 14:27:56.205437 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.205436 2570 scope.go:117] "RemoveContainer" containerID="4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d" Apr 24 14:27:56.205673 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.205654 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d"} err="failed to get container status \"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d\": rpc error: code = NotFound desc = could not find container \"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d\": container with ID starting with 4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d not found: ID does not exist" Apr 24 14:27:56.205728 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.205674 2570 scope.go:117] "RemoveContainer" containerID="63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd" Apr 24 14:27:56.205934 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.205916 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd"} err="failed to get container status \"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd\": rpc error: code = NotFound desc = could not find container \"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd\": container with ID starting with 
63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd not found: ID does not exist" Apr 24 14:27:56.205978 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.205936 2570 scope.go:117] "RemoveContainer" containerID="e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10" Apr 24 14:27:56.206166 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.206150 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10"} err="failed to get container status \"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10\": rpc error: code = NotFound desc = could not find container \"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10\": container with ID starting with e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10 not found: ID does not exist" Apr 24 14:27:56.206206 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.206167 2570 scope.go:117] "RemoveContainer" containerID="9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef" Apr 24 14:27:56.206361 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.206346 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef"} err="failed to get container status \"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef\": rpc error: code = NotFound desc = could not find container \"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef\": container with ID starting with 9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef not found: ID does not exist" Apr 24 14:27:56.206398 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.206361 2570 scope.go:117] "RemoveContainer" containerID="9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23" Apr 24 14:27:56.206569 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.206552 2570 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23"} err="failed to get container status \"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23\": rpc error: code = NotFound desc = could not find container \"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23\": container with ID starting with 9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23 not found: ID does not exist" Apr 24 14:27:56.206608 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.206569 2570 scope.go:117] "RemoveContainer" containerID="c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c" Apr 24 14:27:56.206799 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.206781 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c"} err="failed to get container status \"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c\": rpc error: code = NotFound desc = could not find container \"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c\": container with ID starting with c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c not found: ID does not exist" Apr 24 14:27:56.206843 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.206801 2570 scope.go:117] "RemoveContainer" containerID="ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594" Apr 24 14:27:56.206964 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.206950 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594"} err="failed to get container status \"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594\": rpc error: code = NotFound desc = could not find container 
\"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594\": container with ID starting with ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594 not found: ID does not exist" Apr 24 14:27:56.206964 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.206963 2570 scope.go:117] "RemoveContainer" containerID="4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d" Apr 24 14:27:56.207177 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.207162 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d"} err="failed to get container status \"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d\": rpc error: code = NotFound desc = could not find container \"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d\": container with ID starting with 4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d not found: ID does not exist" Apr 24 14:27:56.207263 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.207178 2570 scope.go:117] "RemoveContainer" containerID="63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd" Apr 24 14:27:56.207328 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.207310 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd"} err="failed to get container status \"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd\": rpc error: code = NotFound desc = could not find container \"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd\": container with ID starting with 63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd not found: ID does not exist" Apr 24 14:27:56.207366 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.207330 2570 scope.go:117] "RemoveContainer" 
containerID="e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10" Apr 24 14:27:56.207502 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.207487 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10"} err="failed to get container status \"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10\": rpc error: code = NotFound desc = could not find container \"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10\": container with ID starting with e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10 not found: ID does not exist" Apr 24 14:27:56.207502 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.207501 2570 scope.go:117] "RemoveContainer" containerID="9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef" Apr 24 14:27:56.207729 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.207711 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef"} err="failed to get container status \"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef\": rpc error: code = NotFound desc = could not find container \"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef\": container with ID starting with 9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef not found: ID does not exist" Apr 24 14:27:56.207792 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.207730 2570 scope.go:117] "RemoveContainer" containerID="9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23" Apr 24 14:27:56.207930 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.207912 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23"} err="failed to get container status 
\"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23\": rpc error: code = NotFound desc = could not find container \"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23\": container with ID starting with 9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23 not found: ID does not exist" Apr 24 14:27:56.207930 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.207929 2570 scope.go:117] "RemoveContainer" containerID="c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c" Apr 24 14:27:56.208142 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.208125 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c"} err="failed to get container status \"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c\": rpc error: code = NotFound desc = could not find container \"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c\": container with ID starting with c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c not found: ID does not exist" Apr 24 14:27:56.208182 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.208143 2570 scope.go:117] "RemoveContainer" containerID="ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594" Apr 24 14:27:56.208329 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.208312 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594"} err="failed to get container status \"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594\": rpc error: code = NotFound desc = could not find container \"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594\": container with ID starting with ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594 not found: ID does not exist" Apr 24 14:27:56.208376 ip-10-0-129-77 
kubenswrapper[2570]: I0424 14:27:56.208329 2570 scope.go:117] "RemoveContainer" containerID="4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d" Apr 24 14:27:56.208521 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.208504 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d"} err="failed to get container status \"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d\": rpc error: code = NotFound desc = could not find container \"4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d\": container with ID starting with 4ca90c1c3f6cb2d00aa64e2eaf73082b812bab8208659214d27a74944077314d not found: ID does not exist" Apr 24 14:27:56.208562 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.208521 2570 scope.go:117] "RemoveContainer" containerID="63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd" Apr 24 14:27:56.208761 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.208743 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd"} err="failed to get container status \"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd\": rpc error: code = NotFound desc = could not find container \"63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd\": container with ID starting with 63513425fce979b966a29e14fcd4621433f8a4423969afed9d2eada3ec212edd not found: ID does not exist" Apr 24 14:27:56.208819 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.208762 2570 scope.go:117] "RemoveContainer" containerID="e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10" Apr 24 14:27:56.208974 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.208959 2570 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10"} err="failed to get container status \"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10\": rpc error: code = NotFound desc = could not find container \"e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10\": container with ID starting with e9c5d294308ae8db07f3e1e13b1ae329f2704ecdf2ff341de4769643ee6a2a10 not found: ID does not exist" Apr 24 14:27:56.209016 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.208975 2570 scope.go:117] "RemoveContainer" containerID="9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef" Apr 24 14:27:56.209173 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.209157 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef"} err="failed to get container status \"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef\": rpc error: code = NotFound desc = could not find container \"9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef\": container with ID starting with 9251d4cee610728a5518fa4869c2222e5399be14a3cc75914a2a0410aa933fef not found: ID does not exist" Apr 24 14:27:56.209222 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.209174 2570 scope.go:117] "RemoveContainer" containerID="9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23" Apr 24 14:27:56.209378 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.209361 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23"} err="failed to get container status \"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23\": rpc error: code = NotFound desc = could not find container \"9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23\": container with ID starting with 
9a3ac3296f4f3909a7a3ccac7d907090987ce2ce6eb768e1ca92e0e2abb9da23 not found: ID does not exist" Apr 24 14:27:56.209426 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.209379 2570 scope.go:117] "RemoveContainer" containerID="c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c" Apr 24 14:27:56.209587 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.209570 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c"} err="failed to get container status \"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c\": rpc error: code = NotFound desc = could not find container \"c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c\": container with ID starting with c8b68c168915dac65203f96fc1ab902d93426848626db45586344e35c266202c not found: ID does not exist" Apr 24 14:27:56.209714 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.209588 2570 scope.go:117] "RemoveContainer" containerID="ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594" Apr 24 14:27:56.209823 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.209806 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594"} err="failed to get container status \"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594\": rpc error: code = NotFound desc = could not find container \"ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594\": container with ID starting with ae5c0a0f5eb4f1f3247df121feb09bb109915a2afa1bb86797e6fd4a1eb84594 not found: ID does not exist" Apr 24 14:27:56.212673 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.212655 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:27:56.212961 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.212950 2570 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="prometheus" Apr 24 14:27:56.213011 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.212963 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="prometheus" Apr 24 14:27:56.213011 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.212973 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="kube-rbac-proxy-thanos" Apr 24 14:27:56.213011 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.212978 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="kube-rbac-proxy-thanos" Apr 24 14:27:56.213011 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.212989 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="kube-rbac-proxy-web" Apr 24 14:27:56.213011 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.212994 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="kube-rbac-proxy-web" Apr 24 14:27:56.213011 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.213001 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="init-config-reloader" Apr 24 14:27:56.213011 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.213007 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="init-config-reloader" Apr 24 14:27:56.213205 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.213016 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="config-reloader" Apr 24 14:27:56.213205 ip-10-0-129-77 kubenswrapper[2570]: 
I0424 14:27:56.213020 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="config-reloader" Apr 24 14:27:56.213205 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.213033 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="thanos-sidecar" Apr 24 14:27:56.213205 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.213046 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="thanos-sidecar" Apr 24 14:27:56.213205 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.213052 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="kube-rbac-proxy" Apr 24 14:27:56.213205 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.213059 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="kube-rbac-proxy" Apr 24 14:27:56.213205 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.213105 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="kube-rbac-proxy-web" Apr 24 14:27:56.213205 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.213112 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="kube-rbac-proxy" Apr 24 14:27:56.213205 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.213118 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="prometheus" Apr 24 14:27:56.213205 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.213124 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="config-reloader" Apr 24 14:27:56.213205 ip-10-0-129-77 kubenswrapper[2570]: I0424 
14:27:56.213130 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="kube-rbac-proxy-thanos" Apr 24 14:27:56.213205 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.213137 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" containerName="thanos-sidecar" Apr 24 14:27:56.218165 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.218150 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.221221 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.221194 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 14:27:56.221310 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.221271 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-9osrbbpqd7ude\"" Apr 24 14:27:56.221372 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.221315 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 14:27:56.221372 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.221358 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 14:27:56.221474 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.221411 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 14:27:56.221546 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.221532 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 14:27:56.221604 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.221550 2570 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 14:27:56.221720 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.221705 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-dzlmh\"" Apr 24 14:27:56.221768 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.221743 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 14:27:56.221813 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.221770 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 14:27:56.221869 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.221854 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 14:27:56.222335 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.222319 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 14:27:56.225216 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.225164 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 14:27:56.227173 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.227155 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 14:27:56.229525 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.229506 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:27:56.241942 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.241921 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242045 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.241946 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242045 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.241983 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-config\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242045 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.241997 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242198 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.242056 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242198 
ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.242100 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242198 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.242132 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242198 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.242159 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242388 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.242201 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242388 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.242239 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-config-out\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242388 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.242263 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242388 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.242293 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242388 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.242319 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242388 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.242381 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242667 ip-10-0-129-77 kubenswrapper[2570]: I0424 
14:27:56.242431 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242667 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.242463 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242667 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.242494 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x6fb\" (UniqueName: \"kubernetes.io/projected/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-kube-api-access-8x6fb\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.242667 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.242519 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-web-config\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343402 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343361 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343576 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343410 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-config-out\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343576 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343430 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343576 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343446 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343576 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343462 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343576 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343482 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343576 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343518 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343576 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343546 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343581 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8x6fb\" (UniqueName: \"kubernetes.io/projected/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-kube-api-access-8x6fb\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343607 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-web-config\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343674 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343705 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343766 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-config\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343789 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343818 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343966 ip-10-0-129-77 
kubenswrapper[2570]: I0424 14:27:56.343845 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343879 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.343966 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343906 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.344442 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.343975 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.344442 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.344371 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.344969 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.344939 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.346388 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.346343 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.346990 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.346820 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.346990 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.346844 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.346990 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.346863 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.346990 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.346979 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.347787 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.347738 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.348250 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.348213 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-config-out\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.348886 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.348846 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.349148 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.349116 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-web-config\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.349454 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.349433 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.349555 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.349454 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.349838 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.349776 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.349968 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.349951 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-config\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.350325 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.350304 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.354009 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.353986 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x6fb\" (UniqueName: \"kubernetes.io/projected/753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315-kube-api-access-8x6fb\") pod \"prometheus-k8s-0\" (UID: \"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.417799 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.417766 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98168d07-827e-4cd1-8fda-2885b13bff36" path="/var/lib/kubelet/pods/98168d07-827e-4cd1-8fda-2885b13bff36/volumes" Apr 24 14:27:56.527893 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.527841 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:27:56.658203 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:56.658174 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 14:27:56.661643 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:27:56.661596 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod753d6b5b_fb2e_42e1_aeb1_ce7b23ca0315.slice/crio-13464bbdf9ac94a11219ad89106f9053316f23f6d858f06d788a9adf4dec988c WatchSource:0}: Error finding container 13464bbdf9ac94a11219ad89106f9053316f23f6d858f06d788a9adf4dec988c: Status 404 returned error can't find the container with id 13464bbdf9ac94a11219ad89106f9053316f23f6d858f06d788a9adf4dec988c Apr 24 14:27:57.152869 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:57.152837 2570 generic.go:358] "Generic (PLEG): container finished" podID="753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315" 
containerID="eba7d42edd0918644d64172fabd4ef93a04c51966bd6395cedc58b0ce89a761c" exitCode=0 Apr 24 14:27:57.153324 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:57.152928 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315","Type":"ContainerDied","Data":"eba7d42edd0918644d64172fabd4ef93a04c51966bd6395cedc58b0ce89a761c"} Apr 24 14:27:57.153324 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:57.152965 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315","Type":"ContainerStarted","Data":"13464bbdf9ac94a11219ad89106f9053316f23f6d858f06d788a9adf4dec988c"} Apr 24 14:27:58.160021 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:58.159976 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315","Type":"ContainerStarted","Data":"c3742fb7d0f3209ccd9d7fccebced5ddfceb3576cda7ffe1637fddf8ca709e1f"} Apr 24 14:27:58.160021 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:58.160018 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315","Type":"ContainerStarted","Data":"180da93ee3b95b0462afe4ee4eee91c77b587ee33a0381be70ba9c6a0acebb9f"} Apr 24 14:27:58.160021 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:58.160031 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315","Type":"ContainerStarted","Data":"774e0e8a0b33fd39d26975bd065eb997ce9a2b6df21248f9b5b08a8836f6abfa"} Apr 24 14:27:58.160480 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:58.160041 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315","Type":"ContainerStarted","Data":"9651fc350b89298a84fec047bd6ac5578f4fbb4ecc076d7bd2990e38a4ab8e4d"} Apr 24 14:27:58.160480 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:58.160054 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315","Type":"ContainerStarted","Data":"9e2d874f1e293c60cdfb9b8c2dc25f11ab83294893a669e18b89e85f35e78df0"} Apr 24 14:27:58.160480 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:58.160067 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315","Type":"ContainerStarted","Data":"5254d5b2e8764e37218f4db63c30de4f95b90863bdf62191e1d0737f1afdc631"} Apr 24 14:27:58.188861 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:27:58.188820 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.188804905 podStartE2EDuration="2.188804905s" podCreationTimestamp="2026-04-24 14:27:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:27:58.186780317 +0000 UTC m=+246.312656045" watchObservedRunningTime="2026-04-24 14:27:58.188804905 +0000 UTC m=+246.314680648" Apr 24 14:28:01.528761 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:28:01.528723 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:28:04.312538 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:28:04.312493 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs\") pod \"network-metrics-daemon-f5bf4\" (UID: \"90958440-ae13-4f74-8dc0-73b738f79139\") " 
pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:28:04.315102 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:28:04.315075 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90958440-ae13-4f74-8dc0-73b738f79139-metrics-certs\") pod \"network-metrics-daemon-f5bf4\" (UID: \"90958440-ae13-4f74-8dc0-73b738f79139\") " pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:28:04.417704 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:28:04.417671 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-68j6s\"" Apr 24 14:28:04.425110 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:28:04.425084 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f5bf4" Apr 24 14:28:04.547176 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:28:04.547082 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f5bf4"] Apr 24 14:28:04.550181 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:28:04.550139 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90958440_ae13_4f74_8dc0_73b738f79139.slice/crio-709ea939b5b6f679021ddfb3ec6c60ac3f46f7a972dd4c6118c6bcdc09416e5c WatchSource:0}: Error finding container 709ea939b5b6f679021ddfb3ec6c60ac3f46f7a972dd4c6118c6bcdc09416e5c: Status 404 returned error can't find the container with id 709ea939b5b6f679021ddfb3ec6c60ac3f46f7a972dd4c6118c6bcdc09416e5c Apr 24 14:28:05.185987 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:28:05.185936 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f5bf4" event={"ID":"90958440-ae13-4f74-8dc0-73b738f79139","Type":"ContainerStarted","Data":"709ea939b5b6f679021ddfb3ec6c60ac3f46f7a972dd4c6118c6bcdc09416e5c"} Apr 24 14:28:06.189946 
ip-10-0-129-77 kubenswrapper[2570]: I0424 14:28:06.189912 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f5bf4" event={"ID":"90958440-ae13-4f74-8dc0-73b738f79139","Type":"ContainerStarted","Data":"517bc81168430a1a99595a506d2a92126a21a3582aca39a2af0a6c5e18361d33"} Apr 24 14:28:06.189946 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:28:06.189949 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f5bf4" event={"ID":"90958440-ae13-4f74-8dc0-73b738f79139","Type":"ContainerStarted","Data":"d6463ef1ea4d862fca85a71a765c5940c548819b84667e68b471c7d9a6ae8232"} Apr 24 14:28:06.208477 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:28:06.208424 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-f5bf4" podStartSLOduration=253.205732291 podStartE2EDuration="4m14.208410086s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:28:04.552015729 +0000 UTC m=+252.677891415" lastFinishedPulling="2026-04-24 14:28:05.554693527 +0000 UTC m=+253.680569210" observedRunningTime="2026-04-24 14:28:06.206184226 +0000 UTC m=+254.332059931" watchObservedRunningTime="2026-04-24 14:28:06.208410086 +0000 UTC m=+254.334285791" Apr 24 14:28:52.291171 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:28:52.291124 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 14:28:56.528808 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:28:56.528768 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:28:56.544886 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:28:56.544860 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:28:57.358103 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:28:57.358074 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 14:32:10.665466 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:10.665432 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-nf5lr"] Apr 24 14:32:10.668647 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:10.668609 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-nf5lr" Apr 24 14:32:10.671371 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:10.671346 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 14:32:10.671505 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:10.671436 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 14:32:10.672567 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:10.672544 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-rhgm5\"" Apr 24 14:32:10.672685 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:10.672564 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 14:32:10.676918 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:10.676894 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-nf5lr"] Apr 24 14:32:10.797126 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:10.797089 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz4nv\" (UniqueName: \"kubernetes.io/projected/8f0e4c32-689a-4060-beba-ab06014cf676-kube-api-access-gz4nv\") pod \"s3-init-nf5lr\" (UID: \"8f0e4c32-689a-4060-beba-ab06014cf676\") " pod="kserve/s3-init-nf5lr" Apr 24 14:32:10.897665 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:10.897611 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gz4nv\" (UniqueName: 
\"kubernetes.io/projected/8f0e4c32-689a-4060-beba-ab06014cf676-kube-api-access-gz4nv\") pod \"s3-init-nf5lr\" (UID: \"8f0e4c32-689a-4060-beba-ab06014cf676\") " pod="kserve/s3-init-nf5lr" Apr 24 14:32:10.906669 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:10.906615 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz4nv\" (UniqueName: \"kubernetes.io/projected/8f0e4c32-689a-4060-beba-ab06014cf676-kube-api-access-gz4nv\") pod \"s3-init-nf5lr\" (UID: \"8f0e4c32-689a-4060-beba-ab06014cf676\") " pod="kserve/s3-init-nf5lr" Apr 24 14:32:10.990369 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:10.990285 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-nf5lr" Apr 24 14:32:11.109134 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:11.109101 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-nf5lr"] Apr 24 14:32:11.112593 ip-10-0-129-77 kubenswrapper[2570]: W0424 14:32:11.112565 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f0e4c32_689a_4060_beba_ab06014cf676.slice/crio-d9fb468a15e65ff997cc5fa94b9444a6f60b174d30604fa7f74a7f1ce122bcf0 WatchSource:0}: Error finding container d9fb468a15e65ff997cc5fa94b9444a6f60b174d30604fa7f74a7f1ce122bcf0: Status 404 returned error can't find the container with id d9fb468a15e65ff997cc5fa94b9444a6f60b174d30604fa7f74a7f1ce122bcf0 Apr 24 14:32:11.114263 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:11.114247 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:32:11.925706 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:11.925664 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nf5lr" event={"ID":"8f0e4c32-689a-4060-beba-ab06014cf676","Type":"ContainerStarted","Data":"d9fb468a15e65ff997cc5fa94b9444a6f60b174d30604fa7f74a7f1ce122bcf0"} Apr 24 14:32:15.938476 
ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:15.938438 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nf5lr" event={"ID":"8f0e4c32-689a-4060-beba-ab06014cf676","Type":"ContainerStarted","Data":"d4fe66d29d8cba447a3de134f9bed9e722681d67ce056572baa6bf36cc7f9ee4"} Apr 24 14:32:15.956850 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:15.956800 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-nf5lr" podStartSLOduration=1.488426984 podStartE2EDuration="5.956784431s" podCreationTimestamp="2026-04-24 14:32:10 +0000 UTC" firstStartedPulling="2026-04-24 14:32:11.114365593 +0000 UTC m=+499.240241276" lastFinishedPulling="2026-04-24 14:32:15.582723025 +0000 UTC m=+503.708598723" observedRunningTime="2026-04-24 14:32:15.955154855 +0000 UTC m=+504.081030561" watchObservedRunningTime="2026-04-24 14:32:15.956784431 +0000 UTC m=+504.082660137" Apr 24 14:32:18.948760 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:18.948674 2570 generic.go:358] "Generic (PLEG): container finished" podID="8f0e4c32-689a-4060-beba-ab06014cf676" containerID="d4fe66d29d8cba447a3de134f9bed9e722681d67ce056572baa6bf36cc7f9ee4" exitCode=0 Apr 24 14:32:18.949171 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:18.948752 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nf5lr" event={"ID":"8f0e4c32-689a-4060-beba-ab06014cf676","Type":"ContainerDied","Data":"d4fe66d29d8cba447a3de134f9bed9e722681d67ce056572baa6bf36cc7f9ee4"} Apr 24 14:32:20.078179 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:20.078158 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-nf5lr" Apr 24 14:32:20.178657 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:20.178590 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz4nv\" (UniqueName: \"kubernetes.io/projected/8f0e4c32-689a-4060-beba-ab06014cf676-kube-api-access-gz4nv\") pod \"8f0e4c32-689a-4060-beba-ab06014cf676\" (UID: \"8f0e4c32-689a-4060-beba-ab06014cf676\") " Apr 24 14:32:20.180915 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:20.180886 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0e4c32-689a-4060-beba-ab06014cf676-kube-api-access-gz4nv" (OuterVolumeSpecName: "kube-api-access-gz4nv") pod "8f0e4c32-689a-4060-beba-ab06014cf676" (UID: "8f0e4c32-689a-4060-beba-ab06014cf676"). InnerVolumeSpecName "kube-api-access-gz4nv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:32:20.279526 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:20.279447 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gz4nv\" (UniqueName: \"kubernetes.io/projected/8f0e4c32-689a-4060-beba-ab06014cf676-kube-api-access-gz4nv\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\"" Apr 24 14:32:20.955707 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:20.955671 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nf5lr" event={"ID":"8f0e4c32-689a-4060-beba-ab06014cf676","Type":"ContainerDied","Data":"d9fb468a15e65ff997cc5fa94b9444a6f60b174d30604fa7f74a7f1ce122bcf0"} Apr 24 14:32:20.955707 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:20.955692 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-nf5lr" Apr 24 14:32:20.955707 ip-10-0-129-77 kubenswrapper[2570]: I0424 14:32:20.955703 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9fb468a15e65ff997cc5fa94b9444a6f60b174d30604fa7f74a7f1ce122bcf0" Apr 24 15:12:32.314146 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.314106 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m85mj/must-gather-bncdl"] Apr 24 15:12:32.316405 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.314492 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f0e4c32-689a-4060-beba-ab06014cf676" containerName="s3-init" Apr 24 15:12:32.316405 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.314506 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0e4c32-689a-4060-beba-ab06014cf676" containerName="s3-init" Apr 24 15:12:32.316405 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.314570 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f0e4c32-689a-4060-beba-ab06014cf676" containerName="s3-init" Apr 24 15:12:32.317423 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.317407 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m85mj/must-gather-bncdl" Apr 24 15:12:32.320107 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.320083 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-m85mj\"/\"default-dockercfg-t2hh5\"" Apr 24 15:12:32.320243 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.320110 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-m85mj\"/\"openshift-service-ca.crt\"" Apr 24 15:12:32.320243 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.320134 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-m85mj\"/\"kube-root-ca.crt\"" Apr 24 15:12:32.332936 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.332910 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m85mj/must-gather-bncdl"] Apr 24 15:12:32.424005 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.423977 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8329532f-e59a-4a80-b537-66ea1ca2252e-must-gather-output\") pod \"must-gather-bncdl\" (UID: \"8329532f-e59a-4a80-b537-66ea1ca2252e\") " pod="openshift-must-gather-m85mj/must-gather-bncdl" Apr 24 15:12:32.424188 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.424025 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-767xr\" (UniqueName: \"kubernetes.io/projected/8329532f-e59a-4a80-b537-66ea1ca2252e-kube-api-access-767xr\") pod \"must-gather-bncdl\" (UID: \"8329532f-e59a-4a80-b537-66ea1ca2252e\") " pod="openshift-must-gather-m85mj/must-gather-bncdl" Apr 24 15:12:32.525074 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.525037 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-767xr\" (UniqueName: 
\"kubernetes.io/projected/8329532f-e59a-4a80-b537-66ea1ca2252e-kube-api-access-767xr\") pod \"must-gather-bncdl\" (UID: \"8329532f-e59a-4a80-b537-66ea1ca2252e\") " pod="openshift-must-gather-m85mj/must-gather-bncdl" Apr 24 15:12:32.525265 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.525127 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8329532f-e59a-4a80-b537-66ea1ca2252e-must-gather-output\") pod \"must-gather-bncdl\" (UID: \"8329532f-e59a-4a80-b537-66ea1ca2252e\") " pod="openshift-must-gather-m85mj/must-gather-bncdl" Apr 24 15:12:32.525419 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.525404 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8329532f-e59a-4a80-b537-66ea1ca2252e-must-gather-output\") pod \"must-gather-bncdl\" (UID: \"8329532f-e59a-4a80-b537-66ea1ca2252e\") " pod="openshift-must-gather-m85mj/must-gather-bncdl" Apr 24 15:12:32.533861 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.533838 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-767xr\" (UniqueName: \"kubernetes.io/projected/8329532f-e59a-4a80-b537-66ea1ca2252e-kube-api-access-767xr\") pod \"must-gather-bncdl\" (UID: \"8329532f-e59a-4a80-b537-66ea1ca2252e\") " pod="openshift-must-gather-m85mj/must-gather-bncdl" Apr 24 15:12:32.641469 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.641434 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m85mj/must-gather-bncdl" Apr 24 15:12:32.762164 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.762139 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m85mj/must-gather-bncdl"] Apr 24 15:12:32.764782 ip-10-0-129-77 kubenswrapper[2570]: W0424 15:12:32.764751 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8329532f_e59a_4a80_b537_66ea1ca2252e.slice/crio-f77d91da0c72996fb0dd4345caa5fc18b25b3a6f58828108e1246b48120e3398 WatchSource:0}: Error finding container f77d91da0c72996fb0dd4345caa5fc18b25b3a6f58828108e1246b48120e3398: Status 404 returned error can't find the container with id f77d91da0c72996fb0dd4345caa5fc18b25b3a6f58828108e1246b48120e3398 Apr 24 15:12:32.766440 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.766424 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 15:12:32.984391 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:32.984311 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m85mj/must-gather-bncdl" event={"ID":"8329532f-e59a-4a80-b537-66ea1ca2252e","Type":"ContainerStarted","Data":"f77d91da0c72996fb0dd4345caa5fc18b25b3a6f58828108e1246b48120e3398"} Apr 24 15:12:38.002046 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:38.002002 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m85mj/must-gather-bncdl" event={"ID":"8329532f-e59a-4a80-b537-66ea1ca2252e","Type":"ContainerStarted","Data":"e4496c265b6e1b6e322bbe3dc8cc427eb907659b97ba9a7896084cabedda2b61"} Apr 24 15:12:39.007692 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:39.007655 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m85mj/must-gather-bncdl" 
event={"ID":"8329532f-e59a-4a80-b537-66ea1ca2252e","Type":"ContainerStarted","Data":"1b5fab6627d4ac0191825f093f4e63365c10ea7d29c45607e1b24c9c145be9db"} Apr 24 15:12:39.023752 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:39.023698 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-m85mj/must-gather-bncdl" podStartSLOduration=1.960706622 podStartE2EDuration="7.023684512s" podCreationTimestamp="2026-04-24 15:12:32 +0000 UTC" firstStartedPulling="2026-04-24 15:12:32.766555028 +0000 UTC m=+2920.892430713" lastFinishedPulling="2026-04-24 15:12:37.829532917 +0000 UTC m=+2925.955408603" observedRunningTime="2026-04-24 15:12:39.022917955 +0000 UTC m=+2927.148793661" watchObservedRunningTime="2026-04-24 15:12:39.023684512 +0000 UTC m=+2927.149560217" Apr 24 15:12:56.062671 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:56.062606 2570 generic.go:358] "Generic (PLEG): container finished" podID="8329532f-e59a-4a80-b537-66ea1ca2252e" containerID="e4496c265b6e1b6e322bbe3dc8cc427eb907659b97ba9a7896084cabedda2b61" exitCode=0 Apr 24 15:12:56.062671 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:56.062670 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m85mj/must-gather-bncdl" event={"ID":"8329532f-e59a-4a80-b537-66ea1ca2252e","Type":"ContainerDied","Data":"e4496c265b6e1b6e322bbe3dc8cc427eb907659b97ba9a7896084cabedda2b61"} Apr 24 15:12:56.063105 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:56.062992 2570 scope.go:117] "RemoveContainer" containerID="e4496c265b6e1b6e322bbe3dc8cc427eb907659b97ba9a7896084cabedda2b61" Apr 24 15:12:56.587579 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:56.587544 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m85mj_must-gather-bncdl_8329532f-e59a-4a80-b537-66ea1ca2252e/gather/0.log" Apr 24 15:12:57.120837 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:57.120800 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-8f8q8/must-gather-dkrz8"] Apr 24 15:12:57.123822 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:57.123804 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8f8q8/must-gather-dkrz8" Apr 24 15:12:57.126586 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:57.126560 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8f8q8\"/\"openshift-service-ca.crt\"" Apr 24 15:12:57.126722 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:57.126564 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8f8q8\"/\"kube-root-ca.crt\"" Apr 24 15:12:57.127539 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:57.127522 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8f8q8\"/\"default-dockercfg-p8tcf\"" Apr 24 15:12:57.131325 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:57.131244 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8f8q8/must-gather-dkrz8"] Apr 24 15:12:57.244561 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:57.244524 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtm2g\" (UniqueName: \"kubernetes.io/projected/2ee92c67-7b88-4196-ab85-1433818abb05-kube-api-access-jtm2g\") pod \"must-gather-dkrz8\" (UID: \"2ee92c67-7b88-4196-ab85-1433818abb05\") " pod="openshift-must-gather-8f8q8/must-gather-dkrz8" Apr 24 15:12:57.244561 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:57.244568 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2ee92c67-7b88-4196-ab85-1433818abb05-must-gather-output\") pod \"must-gather-dkrz8\" (UID: \"2ee92c67-7b88-4196-ab85-1433818abb05\") " pod="openshift-must-gather-8f8q8/must-gather-dkrz8" Apr 24 
15:12:57.345563 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:57.345528 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtm2g\" (UniqueName: \"kubernetes.io/projected/2ee92c67-7b88-4196-ab85-1433818abb05-kube-api-access-jtm2g\") pod \"must-gather-dkrz8\" (UID: \"2ee92c67-7b88-4196-ab85-1433818abb05\") " pod="openshift-must-gather-8f8q8/must-gather-dkrz8" Apr 24 15:12:57.345752 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:57.345574 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2ee92c67-7b88-4196-ab85-1433818abb05-must-gather-output\") pod \"must-gather-dkrz8\" (UID: \"2ee92c67-7b88-4196-ab85-1433818abb05\") " pod="openshift-must-gather-8f8q8/must-gather-dkrz8" Apr 24 15:12:57.345980 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:57.345958 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2ee92c67-7b88-4196-ab85-1433818abb05-must-gather-output\") pod \"must-gather-dkrz8\" (UID: \"2ee92c67-7b88-4196-ab85-1433818abb05\") " pod="openshift-must-gather-8f8q8/must-gather-dkrz8" Apr 24 15:12:57.354064 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:57.354036 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtm2g\" (UniqueName: \"kubernetes.io/projected/2ee92c67-7b88-4196-ab85-1433818abb05-kube-api-access-jtm2g\") pod \"must-gather-dkrz8\" (UID: \"2ee92c67-7b88-4196-ab85-1433818abb05\") " pod="openshift-must-gather-8f8q8/must-gather-dkrz8" Apr 24 15:12:57.434799 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:57.434714 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8f8q8/must-gather-dkrz8" Apr 24 15:12:57.563684 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:57.563660 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8f8q8/must-gather-dkrz8"] Apr 24 15:12:57.565600 ip-10-0-129-77 kubenswrapper[2570]: W0424 15:12:57.565572 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ee92c67_7b88_4196_ab85_1433818abb05.slice/crio-47326ac3ab977240e813a53e13b22664fbcad9f4a5e6900a14637c0c2000a0da WatchSource:0}: Error finding container 47326ac3ab977240e813a53e13b22664fbcad9f4a5e6900a14637c0c2000a0da: Status 404 returned error can't find the container with id 47326ac3ab977240e813a53e13b22664fbcad9f4a5e6900a14637c0c2000a0da Apr 24 15:12:58.069459 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:58.069421 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8f8q8/must-gather-dkrz8" event={"ID":"2ee92c67-7b88-4196-ab85-1433818abb05","Type":"ContainerStarted","Data":"47326ac3ab977240e813a53e13b22664fbcad9f4a5e6900a14637c0c2000a0da"} Apr 24 15:12:59.075584 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:59.075494 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8f8q8/must-gather-dkrz8" event={"ID":"2ee92c67-7b88-4196-ab85-1433818abb05","Type":"ContainerStarted","Data":"e5413174080e58fb4ee75b39bd390597593f85a04386784d6a5da1e6b87a70f0"} Apr 24 15:12:59.075584 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:59.075549 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8f8q8/must-gather-dkrz8" event={"ID":"2ee92c67-7b88-4196-ab85-1433818abb05","Type":"ContainerStarted","Data":"e0f24af01cce82a4ce41715b8c32f7b2599d8b7197e7cecd02292748c7743892"} Apr 24 15:12:59.092160 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:59.092099 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-8f8q8/must-gather-dkrz8" podStartSLOduration=1.158999794 podStartE2EDuration="2.092080265s" podCreationTimestamp="2026-04-24 15:12:57 +0000 UTC" firstStartedPulling="2026-04-24 15:12:57.567556032 +0000 UTC m=+2945.693431715" lastFinishedPulling="2026-04-24 15:12:58.5006365 +0000 UTC m=+2946.626512186" observedRunningTime="2026-04-24 15:12:59.089832986 +0000 UTC m=+2947.215708691" watchObservedRunningTime="2026-04-24 15:12:59.092080265 +0000 UTC m=+2947.217955971" Apr 24 15:12:59.867437 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:12:59.867404 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-682qd_8020bd6f-9604-464e-8df7-c76530a5af7c/global-pull-secret-syncer/0.log" Apr 24 15:13:00.053489 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:00.053458 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zrhjw_0ebb20a0-f56e-442b-8216-34313a45e74c/konnectivity-agent/0.log" Apr 24 15:13:00.097904 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:00.097870 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-77.ec2.internal_935c2debf0f5dd0986c38f8610254d0b/haproxy/0.log" Apr 24 15:13:01.971051 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:01.971010 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m85mj/must-gather-bncdl"] Apr 24 15:13:01.972544 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:01.972245 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-m85mj/must-gather-bncdl" podUID="8329532f-e59a-4a80-b537-66ea1ca2252e" containerName="copy" containerID="cri-o://1b5fab6627d4ac0191825f093f4e63365c10ea7d29c45607e1b24c9c145be9db" gracePeriod=2 Apr 24 15:13:01.974750 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:01.974720 2570 status_manager.go:895] "Failed to get status for pod" 
podUID="8329532f-e59a-4a80-b537-66ea1ca2252e" pod="openshift-must-gather-m85mj/must-gather-bncdl" err="pods \"must-gather-bncdl\" is forbidden: User \"system:node:ip-10-0-129-77.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m85mj\": no relationship found between node 'ip-10-0-129-77.ec2.internal' and this object" Apr 24 15:13:01.977898 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:01.977857 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m85mj/must-gather-bncdl"] Apr 24 15:13:02.362082 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:02.362009 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m85mj_must-gather-bncdl_8329532f-e59a-4a80-b537-66ea1ca2252e/copy/0.log" Apr 24 15:13:02.362851 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:02.362585 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m85mj/must-gather-bncdl" Apr 24 15:13:02.365843 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:02.365798 2570 status_manager.go:895] "Failed to get status for pod" podUID="8329532f-e59a-4a80-b537-66ea1ca2252e" pod="openshift-must-gather-m85mj/must-gather-bncdl" err="pods \"must-gather-bncdl\" is forbidden: User \"system:node:ip-10-0-129-77.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m85mj\": no relationship found between node 'ip-10-0-129-77.ec2.internal' and this object" Apr 24 15:13:02.419892 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:02.419813 2570 status_manager.go:895] "Failed to get status for pod" podUID="8329532f-e59a-4a80-b537-66ea1ca2252e" pod="openshift-must-gather-m85mj/must-gather-bncdl" err="pods \"must-gather-bncdl\" is forbidden: User \"system:node:ip-10-0-129-77.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m85mj\": no relationship found between node 
'ip-10-0-129-77.ec2.internal' and this object" Apr 24 15:13:02.500718 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:02.500681 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8329532f-e59a-4a80-b537-66ea1ca2252e-must-gather-output\") pod \"8329532f-e59a-4a80-b537-66ea1ca2252e\" (UID: \"8329532f-e59a-4a80-b537-66ea1ca2252e\") " Apr 24 15:13:02.500968 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:02.500955 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-767xr\" (UniqueName: \"kubernetes.io/projected/8329532f-e59a-4a80-b537-66ea1ca2252e-kube-api-access-767xr\") pod \"8329532f-e59a-4a80-b537-66ea1ca2252e\" (UID: \"8329532f-e59a-4a80-b537-66ea1ca2252e\") " Apr 24 15:13:02.503784 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:02.503010 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8329532f-e59a-4a80-b537-66ea1ca2252e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8329532f-e59a-4a80-b537-66ea1ca2252e" (UID: "8329532f-e59a-4a80-b537-66ea1ca2252e"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:13:02.504731 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:02.504679 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8329532f-e59a-4a80-b537-66ea1ca2252e-kube-api-access-767xr" (OuterVolumeSpecName: "kube-api-access-767xr") pod "8329532f-e59a-4a80-b537-66ea1ca2252e" (UID: "8329532f-e59a-4a80-b537-66ea1ca2252e"). InnerVolumeSpecName "kube-api-access-767xr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 15:13:02.602238 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:02.602172 2570 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8329532f-e59a-4a80-b537-66ea1ca2252e-must-gather-output\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\"" Apr 24 15:13:02.602238 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:02.602212 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-767xr\" (UniqueName: \"kubernetes.io/projected/8329532f-e59a-4a80-b537-66ea1ca2252e-kube-api-access-767xr\") on node \"ip-10-0-129-77.ec2.internal\" DevicePath \"\"" Apr 24 15:13:03.093327 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:03.093298 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m85mj_must-gather-bncdl_8329532f-e59a-4a80-b537-66ea1ca2252e/copy/0.log" Apr 24 15:13:03.094285 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:03.094255 2570 generic.go:358] "Generic (PLEG): container finished" podID="8329532f-e59a-4a80-b537-66ea1ca2252e" containerID="1b5fab6627d4ac0191825f093f4e63365c10ea7d29c45607e1b24c9c145be9db" exitCode=143 Apr 24 15:13:03.094478 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:03.094466 2570 scope.go:117] "RemoveContainer" containerID="1b5fab6627d4ac0191825f093f4e63365c10ea7d29c45607e1b24c9c145be9db" Apr 24 15:13:03.094733 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:03.094706 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m85mj/must-gather-bncdl" Apr 24 15:13:03.119549 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:03.119441 2570 scope.go:117] "RemoveContainer" containerID="e4496c265b6e1b6e322bbe3dc8cc427eb907659b97ba9a7896084cabedda2b61" Apr 24 15:13:03.152892 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:03.152850 2570 scope.go:117] "RemoveContainer" containerID="1b5fab6627d4ac0191825f093f4e63365c10ea7d29c45607e1b24c9c145be9db" Apr 24 15:13:03.154046 ip-10-0-129-77 kubenswrapper[2570]: E0424 15:13:03.154005 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b5fab6627d4ac0191825f093f4e63365c10ea7d29c45607e1b24c9c145be9db\": container with ID starting with 1b5fab6627d4ac0191825f093f4e63365c10ea7d29c45607e1b24c9c145be9db not found: ID does not exist" containerID="1b5fab6627d4ac0191825f093f4e63365c10ea7d29c45607e1b24c9c145be9db" Apr 24 15:13:03.154169 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:03.154069 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b5fab6627d4ac0191825f093f4e63365c10ea7d29c45607e1b24c9c145be9db"} err="failed to get container status \"1b5fab6627d4ac0191825f093f4e63365c10ea7d29c45607e1b24c9c145be9db\": rpc error: code = NotFound desc = could not find container \"1b5fab6627d4ac0191825f093f4e63365c10ea7d29c45607e1b24c9c145be9db\": container with ID starting with 1b5fab6627d4ac0191825f093f4e63365c10ea7d29c45607e1b24c9c145be9db not found: ID does not exist" Apr 24 15:13:03.154169 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:03.154098 2570 scope.go:117] "RemoveContainer" containerID="e4496c265b6e1b6e322bbe3dc8cc427eb907659b97ba9a7896084cabedda2b61" Apr 24 15:13:03.154453 ip-10-0-129-77 kubenswrapper[2570]: E0424 15:13:03.154428 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e4496c265b6e1b6e322bbe3dc8cc427eb907659b97ba9a7896084cabedda2b61\": container with ID starting with e4496c265b6e1b6e322bbe3dc8cc427eb907659b97ba9a7896084cabedda2b61 not found: ID does not exist" containerID="e4496c265b6e1b6e322bbe3dc8cc427eb907659b97ba9a7896084cabedda2b61" Apr 24 15:13:03.154521 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:03.154462 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4496c265b6e1b6e322bbe3dc8cc427eb907659b97ba9a7896084cabedda2b61"} err="failed to get container status \"e4496c265b6e1b6e322bbe3dc8cc427eb907659b97ba9a7896084cabedda2b61\": rpc error: code = NotFound desc = could not find container \"e4496c265b6e1b6e322bbe3dc8cc427eb907659b97ba9a7896084cabedda2b61\": container with ID starting with e4496c265b6e1b6e322bbe3dc8cc427eb907659b97ba9a7896084cabedda2b61 not found: ID does not exist" Apr 24 15:13:03.701382 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:03.701341 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-p284p_788ee4fa-ade7-40eb-ba34-7fa69f106caf/cluster-monitoring-operator/0.log" Apr 24 15:13:03.801392 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:03.801312 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-585c75f8cc-9sn69_38cf486e-f496-4330-87d5-dbbe5a60b7ce/metrics-server/0.log" Apr 24 15:13:03.934270 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:03.934231 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s6ldx_f8125906-514c-4f62-9b0c-82f218af43f1/node-exporter/0.log" Apr 24 15:13:03.960237 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:03.960210 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s6ldx_f8125906-514c-4f62-9b0c-82f218af43f1/kube-rbac-proxy/0.log" Apr 24 15:13:03.986482 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:03.986450 
2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s6ldx_f8125906-514c-4f62-9b0c-82f218af43f1/init-textfile/0.log" Apr 24 15:13:04.175513 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.175469 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315/prometheus/0.log" Apr 24 15:13:04.193716 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.193684 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315/config-reloader/0.log" Apr 24 15:13:04.217442 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.217411 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315/thanos-sidecar/0.log" Apr 24 15:13:04.243483 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.243455 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315/kube-rbac-proxy-web/0.log" Apr 24 15:13:04.265409 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.265377 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315/kube-rbac-proxy/0.log" Apr 24 15:13:04.286097 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.286067 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315/kube-rbac-proxy-thanos/0.log" Apr 24 15:13:04.307369 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.307345 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_753d6b5b-fb2e-42e1-aeb1-ce7b23ca0315/init-config-reloader/0.log" Apr 24 15:13:04.418748 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.418713 2570 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="8329532f-e59a-4a80-b537-66ea1ca2252e" path="/var/lib/kubelet/pods/8329532f-e59a-4a80-b537-66ea1ca2252e/volumes" Apr 24 15:13:04.428103 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.428031 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b9bf7858d-dwf2g_bc1a78af-2e01-47bb-b73e-143d3e005ea0/telemeter-client/0.log" Apr 24 15:13:04.450853 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.450824 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b9bf7858d-dwf2g_bc1a78af-2e01-47bb-b73e-143d3e005ea0/reload/0.log" Apr 24 15:13:04.481943 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.481898 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b9bf7858d-dwf2g_bc1a78af-2e01-47bb-b73e-143d3e005ea0/kube-rbac-proxy/0.log" Apr 24 15:13:04.521895 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.521869 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b99977797-jpktx_4b0b822c-db44-478f-a5c4-349987078d8c/thanos-query/0.log" Apr 24 15:13:04.543392 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.543355 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b99977797-jpktx_4b0b822c-db44-478f-a5c4-349987078d8c/kube-rbac-proxy-web/0.log" Apr 24 15:13:04.565429 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.565392 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b99977797-jpktx_4b0b822c-db44-478f-a5c4-349987078d8c/kube-rbac-proxy/0.log" Apr 24 15:13:04.587515 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.587477 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b99977797-jpktx_4b0b822c-db44-478f-a5c4-349987078d8c/prom-label-proxy/0.log" Apr 24 15:13:04.610018 ip-10-0-129-77 
kubenswrapper[2570]: I0424 15:13:04.609987 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b99977797-jpktx_4b0b822c-db44-478f-a5c4-349987078d8c/kube-rbac-proxy-rules/0.log" Apr 24 15:13:04.633843 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:04.633809 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b99977797-jpktx_4b0b822c-db44-478f-a5c4-349987078d8c/kube-rbac-proxy-metrics/0.log" Apr 24 15:13:07.724341 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.724304 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf"] Apr 24 15:13:07.724845 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.724823 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8329532f-e59a-4a80-b537-66ea1ca2252e" containerName="copy" Apr 24 15:13:07.724895 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.724851 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8329532f-e59a-4a80-b537-66ea1ca2252e" containerName="copy" Apr 24 15:13:07.724895 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.724870 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8329532f-e59a-4a80-b537-66ea1ca2252e" containerName="gather" Apr 24 15:13:07.724895 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.724880 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8329532f-e59a-4a80-b537-66ea1ca2252e" containerName="gather" Apr 24 15:13:07.724994 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.724957 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8329532f-e59a-4a80-b537-66ea1ca2252e" containerName="gather" Apr 24 15:13:07.724994 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.724973 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8329532f-e59a-4a80-b537-66ea1ca2252e" containerName="copy" Apr 24 15:13:07.729894 
ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.729870 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:07.736538 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.736499 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf"] Apr 24 15:13:07.751114 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.751085 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hs6xt_2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e/dns/0.log" Apr 24 15:13:07.772965 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.772936 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hs6xt_2d92e4ce-51ba-4d8c-ae8e-bd3994ddf63e/kube-rbac-proxy/0.log" Apr 24 15:13:07.837348 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.837319 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7l5sm_67732aa7-95a9-4a96-9e40-a2a525b77a52/dns-node-resolver/0.log" Apr 24 15:13:07.852239 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.852207 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88hss\" (UniqueName: \"kubernetes.io/projected/7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59-kube-api-access-88hss\") pod \"perf-node-gather-daemonset-6gbhf\" (UID: \"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:07.852418 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.852260 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59-sys\") pod \"perf-node-gather-daemonset-6gbhf\" (UID: \"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59\") " 
pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:07.852418 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.852282 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59-proc\") pod \"perf-node-gather-daemonset-6gbhf\" (UID: \"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:07.852418 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.852375 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59-lib-modules\") pod \"perf-node-gather-daemonset-6gbhf\" (UID: \"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:07.852418 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.852405 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59-podres\") pod \"perf-node-gather-daemonset-6gbhf\" (UID: \"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:07.953452 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.953422 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59-sys\") pod \"perf-node-gather-daemonset-6gbhf\" (UID: \"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:07.953452 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.953456 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59-proc\") pod \"perf-node-gather-daemonset-6gbhf\" (UID: \"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:07.953706 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.953568 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59-proc\") pod \"perf-node-gather-daemonset-6gbhf\" (UID: \"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:07.953706 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.953577 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59-sys\") pod \"perf-node-gather-daemonset-6gbhf\" (UID: \"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:07.953706 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.953597 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59-lib-modules\") pod \"perf-node-gather-daemonset-6gbhf\" (UID: \"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:07.953706 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.953670 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59-podres\") pod \"perf-node-gather-daemonset-6gbhf\" (UID: \"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:07.953838 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.953706 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59-lib-modules\") pod \"perf-node-gather-daemonset-6gbhf\" (UID: \"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:07.953838 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.953724 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88hss\" (UniqueName: \"kubernetes.io/projected/7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59-kube-api-access-88hss\") pod \"perf-node-gather-daemonset-6gbhf\" (UID: \"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:07.953838 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.953790 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59-podres\") pod \"perf-node-gather-daemonset-6gbhf\" (UID: \"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:07.961341 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:07.961315 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88hss\" (UniqueName: \"kubernetes.io/projected/7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59-kube-api-access-88hss\") pod \"perf-node-gather-daemonset-6gbhf\" (UID: \"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59\") " pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:08.042504 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:08.042427 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:08.182141 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:08.182105 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf"] Apr 24 15:13:08.189181 ip-10-0-129-77 kubenswrapper[2570]: W0424 15:13:08.189127 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7e7e57cc_0c3d_4183_8c4b_9ff2a4136b59.slice/crio-d6dcae3a6e26522df9151e5a74bfc784ecf5de5f56b606f748d7219b998e8ef0 WatchSource:0}: Error finding container d6dcae3a6e26522df9151e5a74bfc784ecf5de5f56b606f748d7219b998e8ef0: Status 404 returned error can't find the container with id d6dcae3a6e26522df9151e5a74bfc784ecf5de5f56b606f748d7219b998e8ef0 Apr 24 15:13:08.310353 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:08.310324 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7cd9495d59-q64t7_2a9e2553-3905-4307-bccc-cf5c6355779f/registry/0.log" Apr 24 15:13:08.328803 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:08.328783 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-629tj_9bc5ec31-bf4b-46de-abb2-21a96bd6160a/node-ca/0.log" Apr 24 15:13:09.125419 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:09.125327 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" event={"ID":"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59","Type":"ContainerStarted","Data":"db83400b0b9ddbe36f49fddb5dcfdd905346df738da15518287e6f4853ea1f4c"} Apr 24 15:13:09.125419 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:09.125367 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" 
event={"ID":"7e7e57cc-0c3d-4183-8c4b-9ff2a4136b59","Type":"ContainerStarted","Data":"d6dcae3a6e26522df9151e5a74bfc784ecf5de5f56b606f748d7219b998e8ef0"} Apr 24 15:13:09.125419 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:09.125412 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:09.434692 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:09.434577 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-lkncg_7ab7e98f-94b0-4d9b-9d46-6666f05fb86a/serve-healthcheck-canary/0.log" Apr 24 15:13:09.780241 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:09.780161 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-hp8qf_562c06c8-1f5f-456c-81f3-a499e9769f12/insights-operator/0.log" Apr 24 15:13:09.782482 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:09.782459 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-hp8qf_562c06c8-1f5f-456c-81f3-a499e9769f12/insights-operator/1.log" Apr 24 15:13:09.801491 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:09.801464 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9z6hz_445ae754-e43b-4a53-8625-8ecbb1ab28b1/kube-rbac-proxy/0.log" Apr 24 15:13:09.821307 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:09.821284 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9z6hz_445ae754-e43b-4a53-8625-8ecbb1ab28b1/exporter/0.log" Apr 24 15:13:09.840739 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:09.840708 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9z6hz_445ae754-e43b-4a53-8625-8ecbb1ab28b1/extractor/0.log" Apr 24 15:13:12.188027 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:12.187996 
2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-nf5lr_8f0e4c32-689a-4060-beba-ab06014cf676/s3-init/0.log" Apr 24 15:13:15.139936 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:15.139907 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" Apr 24 15:13:15.157098 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:15.157054 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8f8q8/perf-node-gather-daemonset-6gbhf" podStartSLOduration=8.157039327 podStartE2EDuration="8.157039327s" podCreationTimestamp="2026-04-24 15:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:13:09.140297969 +0000 UTC m=+2957.266173719" watchObservedRunningTime="2026-04-24 15:13:15.157039327 +0000 UTC m=+2963.282915030" Apr 24 15:13:15.770990 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:15.770964 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-w6fhd_eb780e77-4955-4615-851f-ddc719d4d780/migrator/0.log" Apr 24 15:13:15.790124 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:15.790090 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-w6fhd_eb780e77-4955-4615-851f-ddc719d4d780/graceful-termination/0.log" Apr 24 15:13:16.131528 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:16.131490 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-wtc7l_01600e7e-779b-41ca-b62a-79288fc11666/kube-storage-version-migrator-operator/1.log" Apr 24 15:13:16.132471 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:16.132449 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-wtc7l_01600e7e-779b-41ca-b62a-79288fc11666/kube-storage-version-migrator-operator/0.log" Apr 24 15:13:17.430105 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:17.430078 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-96scn_03c62f6c-4b55-4a95-82a9-797e22c98930/kube-multus-additional-cni-plugins/0.log" Apr 24 15:13:17.455284 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:17.455262 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-96scn_03c62f6c-4b55-4a95-82a9-797e22c98930/egress-router-binary-copy/0.log" Apr 24 15:13:17.480235 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:17.480204 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-96scn_03c62f6c-4b55-4a95-82a9-797e22c98930/cni-plugins/0.log" Apr 24 15:13:17.502728 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:17.502692 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-96scn_03c62f6c-4b55-4a95-82a9-797e22c98930/bond-cni-plugin/0.log" Apr 24 15:13:17.524413 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:17.524385 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-96scn_03c62f6c-4b55-4a95-82a9-797e22c98930/routeoverride-cni/0.log" Apr 24 15:13:17.546519 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:17.546489 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-96scn_03c62f6c-4b55-4a95-82a9-797e22c98930/whereabouts-cni-bincopy/0.log" Apr 24 15:13:17.567896 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:17.567866 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-96scn_03c62f6c-4b55-4a95-82a9-797e22c98930/whereabouts-cni/0.log" Apr 24 15:13:17.811352 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:17.811271 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qhq2p_e37f0e7d-3cac-4f35-a7d4-ba18c81bc1c4/kube-multus/0.log" Apr 24 15:13:17.859908 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:17.859877 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f5bf4_90958440-ae13-4f74-8dc0-73b738f79139/network-metrics-daemon/0.log" Apr 24 15:13:17.879251 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:17.879225 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f5bf4_90958440-ae13-4f74-8dc0-73b738f79139/kube-rbac-proxy/0.log" Apr 24 15:13:19.097026 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:19.096983 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbjt7_77cbed1f-c18d-4a43-bcc3-230e23453a72/ovn-controller/0.log" Apr 24 15:13:19.145247 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:19.145216 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbjt7_77cbed1f-c18d-4a43-bcc3-230e23453a72/ovn-acl-logging/0.log" Apr 24 15:13:19.168553 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:19.168523 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbjt7_77cbed1f-c18d-4a43-bcc3-230e23453a72/kube-rbac-proxy-node/0.log" Apr 24 15:13:19.188804 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:19.188773 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbjt7_77cbed1f-c18d-4a43-bcc3-230e23453a72/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 15:13:19.205864 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:19.205774 2570 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbjt7_77cbed1f-c18d-4a43-bcc3-230e23453a72/northd/0.log" Apr 24 15:13:19.226587 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:19.226555 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbjt7_77cbed1f-c18d-4a43-bcc3-230e23453a72/nbdb/0.log" Apr 24 15:13:19.245841 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:19.245808 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbjt7_77cbed1f-c18d-4a43-bcc3-230e23453a72/sbdb/0.log" Apr 24 15:13:19.354938 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:19.354909 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbjt7_77cbed1f-c18d-4a43-bcc3-230e23453a72/ovnkube-controller/0.log" Apr 24 15:13:20.597156 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:20.597129 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-fkvkp_3640d87a-9a53-41b1-912e-39a56479c86c/network-check-target-container/0.log" Apr 24 15:13:21.520278 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:21.520251 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-v5zt2_562db2e5-0cd5-452e-9208-ceb438b32453/iptables-alerter/0.log" Apr 24 15:13:22.116410 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:22.116365 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-52ssx_b2d9a8e7-5a32-4737-96ef-7cb670736a8b/tuned/0.log" Apr 24 15:13:23.868650 ip-10-0-129-77 kubenswrapper[2570]: I0424 15:13:23.868603 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-kmprr_02995ee7-a76c-4aff-b87c-83b4540741cb/cluster-samples-operator/0.log" Apr 24 15:13:23.884105 ip-10-0-129-77 kubenswrapper[2570]: I0424 
15:13:23.884076 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-kmprr_02995ee7-a76c-4aff-b87c-83b4540741cb/cluster-samples-operator-watch/0.log"