Apr 16 10:04:37.273821 ip-10-0-135-1 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 10:04:37.730071 ip-10-0-135-1 kubenswrapper[2525]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 10:04:37.730071 ip-10-0-135-1 kubenswrapper[2525]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 10:04:37.730071 ip-10-0-135-1 kubenswrapper[2525]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 10:04:37.730071 ip-10-0-135-1 kubenswrapper[2525]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 10:04:37.730071 ip-10-0-135-1 kubenswrapper[2525]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 10:04:37.732968 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.732866 2525 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 10:04:37.738069 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738051 2525 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 10:04:37.738069 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738071 2525 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738075 2525 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738079 2525 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738082 2525 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738103 2525 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738106 2525 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738110 2525 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738112 2525 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738115 2525 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738118 2525 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738121 2525 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738123 2525 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738126 2525 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738128 2525 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738131 2525 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738134 2525 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738136 2525 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738139 2525 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738141 2525 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738144 2525 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 10:04:37.738140 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738147 2525 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738150 2525 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738153 2525 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738156 2525 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738158 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738161 2525 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738164 2525 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738166 2525 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738168 2525 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738171 2525 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738173 2525 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738176 2525 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738179 2525 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738182 2525 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738184 2525 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738187 2525 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738189 2525 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738192 2525 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738194 2525 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738197 2525 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 10:04:37.738624 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738199 2525 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738202 2525 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738204 2525 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738208 2525 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738210 2525 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738213 2525 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738216 2525 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738218 2525 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738221 2525 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738224 2525 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738226 2525 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738229 2525 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738232 2525 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738234 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738239 2525 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738241 2525 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738246 2525 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738250 2525 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738254 2525 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 10:04:37.739108 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738257 2525 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738259 2525 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738262 2525 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738265 2525 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738268 2525 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738270 2525 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738273 2525 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738275 2525 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738278 2525 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738280 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738283 2525 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738288 2525 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738291 2525 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738293 2525 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738296 2525 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738299 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738302 2525 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738305 2525 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738309 2525 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 10:04:37.739588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738312 2525 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738314 2525 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738317 2525 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738319 2525 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738322 2525 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738325 2525 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738328 2525 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738757 2525 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738764 2525 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738767 2525 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738770 2525 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738772 2525 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738775 2525 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738778 2525 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738780 2525 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738783 2525 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738785 2525 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738788 2525 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738790 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738793 2525 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 10:04:37.740055 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738796 2525 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738798 2525 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738801 2525 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738803 2525 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738806 2525 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738809 2525 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738812 2525 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738815 2525 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738817 2525 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738819 2525 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738822 2525 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738825 2525 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738827 2525 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738830 2525 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738832 2525 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738835 2525 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738838 2525 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738841 2525 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738843 2525 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738846 2525 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 10:04:37.740551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738848 2525 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738852 2525 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738854 2525 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738857 2525 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738859 2525 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738862 2525 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738864 2525 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738867 2525 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738869 2525 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738872 2525 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738874 2525 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738877 2525 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738880 2525 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738882 2525 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738884 2525 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738887 2525 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738889 2525 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738893 2525 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738897 2525 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 10:04:37.741065 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738900 2525 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738903 2525 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738906 2525 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738908 2525 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738911 2525 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738913 2525 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738915 2525 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738918 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738920 2525 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738925 2525 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738927 2525 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738930 2525 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738932 2525 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738936 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738938 2525 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738942 2525 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738944 2525 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738947 2525 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738949 2525 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738952 2525 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 10:04:37.741532 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738955 2525 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738957 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738959 2525 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738962 2525 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738964 2525 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738967 2525 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738969 2525 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738972 2525 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738974 2525 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738977 2525 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738980 2525 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738985 2525 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738988 2525 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.738991 2525 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740496 2525 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740522 2525 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740529 2525 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740534 2525 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740539 2525 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740543 2525 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740548 2525 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 10:04:37.742031 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740553 2525 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740557 2525 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740560 2525 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740564 2525 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740568 2525 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740571 2525 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740574 2525 flags.go:64] FLAG: --cgroup-root=""
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740579 2525 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740582 2525 flags.go:64] FLAG: --client-ca-file=""
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740585 2525 flags.go:64] FLAG: --cloud-config=""
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740588 2525 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740591 2525 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740596 2525 flags.go:64] FLAG: --cluster-domain=""
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740599 2525 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740602 2525 flags.go:64] FLAG: --config-dir=""
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740605 2525 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740610 2525 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740614 2525 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740617 2525 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740620 2525 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740623 2525 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740626 2525 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740629 2525 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740633 2525 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740636 2525 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 10:04:37.742559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740639 2525 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740644 2525 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740647 2525 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740650 2525 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740653 2525 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740656 2525 flags.go:64] FLAG: --enable-server="true"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740659 2525 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740664 2525 flags.go:64] FLAG: --event-burst="100"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740667 2525 flags.go:64] FLAG: --event-qps="50"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740670 2525 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740673 2525 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740677 2525 flags.go:64] FLAG: --eviction-hard=""
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740681 2525 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740684 2525 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740687 2525 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740691 2525 flags.go:64] FLAG: --eviction-soft=""
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740694 2525 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740697 2525 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740700 2525 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740703 2525 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740706 2525 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740709 2525 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740712 2525 flags.go:64] FLAG: --feature-gates=""
Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740716 2525 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 10:04:37.743256
ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740719 2525 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 10:04:37.743256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740722 2525 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740725 2525 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740729 2525 flags.go:64] FLAG: --healthz-port="10248" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740732 2525 flags.go:64] FLAG: --help="false" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740735 2525 flags.go:64] FLAG: --hostname-override="ip-10-0-135-1.ec2.internal" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740738 2525 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740741 2525 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740744 2525 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740747 2525 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740751 2525 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740754 2525 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740757 2525 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740761 2525 flags.go:64] FLAG: 
--kernel-memcg-notification="false" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740764 2525 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740767 2525 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740770 2525 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740773 2525 flags.go:64] FLAG: --kube-reserved="" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740776 2525 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740779 2525 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740782 2525 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740785 2525 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740788 2525 flags.go:64] FLAG: --lock-file="" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740790 2525 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740794 2525 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 10:04:37.743869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740797 2525 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740803 2525 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740806 2525 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740809 2525 
flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740812 2525 flags.go:64] FLAG: --logging-format="text" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740815 2525 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740818 2525 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740821 2525 flags.go:64] FLAG: --manifest-url="" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740825 2525 flags.go:64] FLAG: --manifest-url-header="" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740830 2525 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740833 2525 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740837 2525 flags.go:64] FLAG: --max-pods="110" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740840 2525 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740843 2525 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740846 2525 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740849 2525 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740852 2525 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740856 2525 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 10:04:37.744436 ip-10-0-135-1 
kubenswrapper[2525]: I0416 10:04:37.740859 2525 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740867 2525 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740870 2525 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740873 2525 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740876 2525 flags.go:64] FLAG: --pod-cidr="" Apr 16 10:04:37.744436 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740879 2525 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740885 2525 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740888 2525 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740891 2525 flags.go:64] FLAG: --pods-per-core="0" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740895 2525 flags.go:64] FLAG: --port="10250" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740898 2525 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740900 2525 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ec24ab627560083d" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740904 2525 flags.go:64] FLAG: --qos-reserved="" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740907 2525 flags.go:64] FLAG: --read-only-port="10255" Apr 16 10:04:37.745019 ip-10-0-135-1 
kubenswrapper[2525]: I0416 10:04:37.740911 2525 flags.go:64] FLAG: --register-node="true" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740914 2525 flags.go:64] FLAG: --register-schedulable="true" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740916 2525 flags.go:64] FLAG: --register-with-taints="" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740920 2525 flags.go:64] FLAG: --registry-burst="10" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740923 2525 flags.go:64] FLAG: --registry-qps="5" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740926 2525 flags.go:64] FLAG: --reserved-cpus="" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740929 2525 flags.go:64] FLAG: --reserved-memory="" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740932 2525 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740936 2525 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740939 2525 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740942 2525 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740945 2525 flags.go:64] FLAG: --runonce="false" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740948 2525 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740951 2525 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740954 2525 flags.go:64] FLAG: --seccomp-default="false" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: 
I0416 10:04:37.740957 2525 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740960 2525 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 10:04:37.745019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740964 2525 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740967 2525 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740971 2525 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740973 2525 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740976 2525 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740979 2525 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740983 2525 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740986 2525 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740989 2525 flags.go:64] FLAG: --system-cgroups="" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740992 2525 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.740998 2525 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.741001 2525 flags.go:64] FLAG: --tls-cert-file="" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.741004 2525 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 
10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.741009 2525 flags.go:64] FLAG: --tls-min-version="" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.741011 2525 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.741014 2525 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.741018 2525 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.741021 2525 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.741024 2525 flags.go:64] FLAG: --v="2" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.741028 2525 flags.go:64] FLAG: --version="false" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.741032 2525 flags.go:64] FLAG: --vmodule="" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.741037 2525 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.741040 2525 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741138 2525 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 10:04:37.745657 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741142 2525 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741145 2525 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741148 2525 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 10:04:37.746272 ip-10-0-135-1 
kubenswrapper[2525]: W0416 10:04:37.741152 2525 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741155 2525 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741158 2525 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741161 2525 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741164 2525 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741168 2525 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741172 2525 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741176 2525 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741179 2525 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741182 2525 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741185 2525 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741188 2525 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741190 2525 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741193 2525 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741195 2525 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741198 2525 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741200 2525 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 10:04:37.746272 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741203 2525 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741206 2525 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741208 2525 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741210 2525 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741213 2525 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741216 2525 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741219 2525 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741221 2525 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741225 2525 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741228 2525 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741230 2525 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741233 2525 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741237 2525 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741239 2525 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741242 2525 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741245 2525 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741247 2525 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741250 2525 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741252 2525 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 10:04:37.746799 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741255 2525 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741258 2525 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741261 2525 
feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741263 2525 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741266 2525 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741269 2525 feature_gate.go:328] unrecognized feature gate: Example Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741271 2525 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741274 2525 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741276 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741279 2525 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741281 2525 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741284 2525 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741286 2525 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741289 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741291 2525 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 
10:04:37.741294 2525 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741296 2525 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741299 2525 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741301 2525 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741304 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 10:04:37.747265 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741307 2525 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741314 2525 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741316 2525 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741319 2525 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741322 2525 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741324 2525 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741327 2525 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741330 2525 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 
10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741332 2525 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741335 2525 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741337 2525 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741340 2525 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741342 2525 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741347 2525 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741349 2525 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741352 2525 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741355 2525 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741357 2525 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741360 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741362 2525 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 10:04:37.747783 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741365 2525 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 10:04:37.748278 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741367 2525 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 10:04:37.748278 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741370 2525 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 10:04:37.748278 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741373 2525 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 10:04:37.748278 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741376 2525 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 10:04:37.748278 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.741379 2525 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 10:04:37.748278 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.742196 2525 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 10:04:37.752735 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.752709 2525 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 10:04:37.752735 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.752733 2525 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752786 2525 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752792 2525 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752795 2525 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752799 2525 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752802 2525 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752805 2525 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752808 2525 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752811 2525 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752814 2525 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752817 2525 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752819 2525 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752822 2525 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752826 2525 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752828 2525 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752831 2525 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752834 2525 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752836 2525 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752839 2525 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 10:04:37.752889 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752841 2525 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752846 2525 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752850 2525 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752853 2525 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752856 2525 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752859 2525 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752862 2525 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752865 2525 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752869 2525 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752872 2525 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752876 2525 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752879 2525 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752882 2525 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752884 2525 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752887 2525 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752890 2525 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752894 2525 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752897 2525 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752900 2525 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752903 2525 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 10:04:37.753363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752906 2525 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752909 2525 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752912 2525 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752914 2525 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752917 2525 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752920 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752923 2525 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752925 2525 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752928 2525 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752931 2525 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752933 2525 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752936 2525 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752939 2525 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752942 2525 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752945 2525 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752948 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752951 2525 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752953 2525 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752956 2525 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752959 2525 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 10:04:37.753913 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752961 2525 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752964 2525 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752967 2525 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752970 2525 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752972 2525 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752975 2525 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752977 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752980 2525 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752983 2525 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752986 2525 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752988 2525 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752991 2525 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752994 2525 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752996 2525 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.752999 2525 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753001 2525 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753004 2525 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753007 2525 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753010 2525 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 10:04:37.754403 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753014 2525 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 10:04:37.754890 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753019 2525 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 10:04:37.754890 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753021 2525 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 10:04:37.754890 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753024 2525 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 10:04:37.754890 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753027 2525 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 10:04:37.754890 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753030 2525 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 10:04:37.754890 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753033 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 10:04:37.754890 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753036 2525 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 10:04:37.754890 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753038 2525 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 10:04:37.754890 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.753044 2525 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 10:04:37.754890 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753145 2525 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 10:04:37.754890 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753151 2525 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 10:04:37.754890 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753156 2525 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 10:04:37.754890 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753159 2525 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 10:04:37.754890 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753162 2525 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 10:04:37.754890 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753165 2525 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753168 2525 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753170 2525 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753173 2525 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753176 2525 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753178 2525 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753182 2525 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753184 2525 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753188 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753190 2525 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753193 2525 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753196 2525 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753198 2525 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753201 2525 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753203 2525 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753206 2525 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753209 2525 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753212 2525 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753215 2525 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753218 2525 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 10:04:37.755270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753221 2525 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753223 2525 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753226 2525 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753229 2525 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753232 2525 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753234 2525 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753238 2525 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753241 2525 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753244 2525 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753247 2525 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753250 2525 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753253 2525 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753255 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753258 2525 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753260 2525 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753263 2525 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753266 2525 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753269 2525 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753271 2525 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 10:04:37.755870 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753274 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753277 2525 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753279 2525 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753282 2525 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753284 2525 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753287 2525 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753289 2525 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753292 2525 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753295 2525 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753297 2525 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753300 2525 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753304 2525 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753306 2525 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753309 2525 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753312 2525 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753315 2525 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753317 2525 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753320 2525 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753323 2525 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 10:04:37.756330 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753325 2525 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753328 2525 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753330 2525 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753333 2525 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753335 2525 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753338 2525 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753340 2525 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753343 2525 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753345 2525 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753348 2525 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753351 2525 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753354 2525 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753356 2525 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753359 2525 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753361 2525 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753364 2525 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753367 2525 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753369 2525 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753372 2525 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753375 2525 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 10:04:37.756844 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753378 2525 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 10:04:37.757336 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753380 2525 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 10:04:37.757336 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:37.753383 2525 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 10:04:37.757336 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.753388 2525 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 10:04:37.757336 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.754220 2525 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 10:04:37.757336 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.756363 2525 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 10:04:37.757531 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.757388 2525 server.go:1019] "Starting client certificate rotation"
Apr 16 10:04:37.757531 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.757492 2525 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 10:04:37.757635 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.757563 2525 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 10:04:37.783120 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.783088 2525 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 10:04:37.786084 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.786059 2525 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 10:04:37.803071 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.803018 2525 log.go:25] "Validated CRI v1 runtime API"
Apr 16 10:04:37.808535 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.808499 2525 log.go:25] "Validated CRI v1 image API"
Apr 16 10:04:37.809811 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.809796 2525 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 10:04:37.813711 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.813694 2525 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 10:04:37.813816 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.813786 2525 fs.go:135] Filesystem UUIDs: map[35f5fe7d-52b4-49ec-9677-4cd04d17ac10:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 f28ceeb0-0644-4090-a360-a173a81de404:/dev/nvme0n1p4]
Apr 16 10:04:37.813914 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.813810 2525 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 10:04:37.819920 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.819810 2525 manager.go:217] Machine: {Timestamp:2026-04-16 10:04:37.817800246 +0000 UTC m=+0.417248212 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3088526 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec284eead1fd4e324a28a6b0334d3ac0 SystemUUID:ec284eea-d1fd-4e32-4a28-a6b0334d3ac0 BootID:885036c0-7094-42de-9037-41693af01f20 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:27:ed:75:b3:9b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:27:ed:75:b3:9b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ce:7e:8f:86:74:9f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 10:04:37.819920 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.819910 2525 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 10:04:37.820060 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.819997 2525 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 10:04:37.822804 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.822780 2525 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 10:04:37.822951 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.822806 2525 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-1.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"contain
er","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 10:04:37.822993 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.822961 2525 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 10:04:37.822993 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.822970 2525 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 10:04:37.822993 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.822983 2525 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 10:04:37.823697 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.823686 2525 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 10:04:37.825061 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.825051 2525 state_mem.go:36] "Initialized new in-memory state store" Apr 16 10:04:37.825174 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.825165 2525 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 10:04:37.827321 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.827310 2525 kubelet.go:491] "Attempting to sync node with API server" Apr 16 10:04:37.827368 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.827329 2525 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 10:04:37.827368 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.827341 2525 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 10:04:37.827368 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.827350 2525 kubelet.go:397] "Adding apiserver pod source" Apr 16 10:04:37.827368 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.827359 2525 apiserver.go:42] "Waiting for node sync before watching apiserver 
pods" Apr 16 10:04:37.829251 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.829235 2525 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 10:04:37.829346 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.829269 2525 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 10:04:37.832599 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.832580 2525 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 10:04:37.834385 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.834371 2525 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 10:04:37.835952 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.835941 2525 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 10:04:37.836012 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.835957 2525 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 10:04:37.836012 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.835964 2525 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 10:04:37.836012 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.835969 2525 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 10:04:37.836012 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.835975 2525 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 10:04:37.836012 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.835981 2525 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 10:04:37.836012 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.835987 2525 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 10:04:37.836012 ip-10-0-135-1 
kubenswrapper[2525]: I0416 10:04:37.835992 2525 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 10:04:37.836012 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.835999 2525 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 10:04:37.836012 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.836005 2525 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 10:04:37.836012 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.836013 2525 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 10:04:37.836263 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.836022 2525 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 10:04:37.836596 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.836580 2525 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dtdnf" Apr 16 10:04:37.836774 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.836765 2525 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 10:04:37.836804 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.836777 2525 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 10:04:37.838933 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:37.838904 2525 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-1.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 10:04:37.839011 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:37.838904 2525 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" 
logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 10:04:37.840471 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.840458 2525 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 10:04:37.840537 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.840498 2525 server.go:1295] "Started kubelet" Apr 16 10:04:37.840606 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.840581 2525 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 10:04:37.841360 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.841301 2525 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 10:04:37.841459 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.841387 2525 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 10:04:37.841399 ip-10-0-135-1 systemd[1]: Started Kubernetes Kubelet. Apr 16 10:04:37.842740 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.842721 2525 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 10:04:37.843970 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.843954 2525 server.go:317] "Adding debug handlers to kubelet server" Apr 16 10:04:37.847161 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.847141 2525 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dtdnf" Apr 16 10:04:37.850210 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.850190 2525 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-1.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 10:04:37.851071 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:37.850239 2525 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-1.ec2.internal.18a6ce401f2c5ee5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-1.ec2.internal,UID:ip-10-0-135-1.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-1.ec2.internal,},FirstTimestamp:2026-04-16 10:04:37.840469733 +0000 UTC m=+0.439917680,LastTimestamp:2026-04-16 10:04:37.840469733 +0000 UTC m=+0.439917680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-1.ec2.internal,}" Apr 16 10:04:37.851920 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:37.851902 2525 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 10:04:37.852676 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.852652 2525 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 10:04:37.853190 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.853176 2525 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 10:04:37.853993 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.853970 2525 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 10:04:37.854126 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.854109 2525 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 10:04:37.854181 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.854132 2525 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 10:04:37.854272 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.854260 2525 reconstruct.go:97] "Volume reconstruction finished" 
Apr 16 10:04:37.854325 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.854274 2525 reconciler.go:26] "Reconciler: start to sync state" Apr 16 10:04:37.854898 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.854878 2525 factory.go:55] Registering systemd factory Apr 16 10:04:37.854980 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.854902 2525 factory.go:223] Registration of the systemd container factory successfully Apr 16 10:04:37.855078 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:37.855054 2525 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-1.ec2.internal\" not found" Apr 16 10:04:37.855247 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.855233 2525 factory.go:153] Registering CRI-O factory Apr 16 10:04:37.855368 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.855356 2525 factory.go:223] Registration of the crio container factory successfully Apr 16 10:04:37.855446 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.855433 2525 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 10:04:37.855523 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.855465 2525 factory.go:103] Registering Raw factory Apr 16 10:04:37.855523 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.855482 2525 manager.go:1196] Started watching for new ooms in manager Apr 16 10:04:37.856032 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.855955 2525 manager.go:319] Starting recovery of all containers Apr 16 10:04:37.857871 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.857850 2525 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 10:04:37.860695 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:37.860672 2525 nodelease.go:49] "Failed to get node when trying to set owner ref to 
the node lease" err="nodes \"ip-10-0-135-1.ec2.internal\" not found" node="ip-10-0-135-1.ec2.internal" Apr 16 10:04:37.866233 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.866113 2525 manager.go:324] Recovery completed Apr 16 10:04:37.870544 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.870496 2525 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:04:37.873252 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.873233 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:04:37.873332 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.873265 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:04:37.873332 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.873278 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:04:37.873778 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.873764 2525 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 10:04:37.873778 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.873776 2525 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 10:04:37.873888 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.873795 2525 state_mem.go:36] "Initialized new in-memory state store" Apr 16 10:04:37.876325 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.876313 2525 policy_none.go:49] "None policy: Start" Apr 16 10:04:37.876373 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.876330 2525 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 10:04:37.876373 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.876340 2525 state_mem.go:35] "Initializing new in-memory state store" Apr 16 10:04:37.913179 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.913163 2525 manager.go:341] "Starting Device 
Plugin manager" Apr 16 10:04:37.927905 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:37.913197 2525 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 10:04:37.927905 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.913207 2525 server.go:85] "Starting device plugin registration server" Apr 16 10:04:37.927905 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.913447 2525 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 10:04:37.927905 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.913460 2525 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 10:04:37.927905 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.913550 2525 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 10:04:37.927905 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.913621 2525 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 10:04:37.927905 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.913630 2525 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 10:04:37.927905 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:37.914227 2525 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 10:04:37.927905 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:37.914262 2525 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-1.ec2.internal\" not found" Apr 16 10:04:37.986593 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.986520 2525 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 10:04:37.987672 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.987652 2525 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 10:04:37.987786 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.987681 2525 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 10:04:37.987786 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.987700 2525 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 10:04:37.987786 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.987709 2525 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 10:04:37.987786 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:37.987744 2525 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 10:04:37.990225 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:37.990202 2525 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 10:04:38.014094 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.014075 2525 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:04:38.015063 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.015050 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:04:38.015140 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.015080 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:04:38.015140 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.015092 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:04:38.015140 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.015117 2525 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-1.ec2.internal" Apr 16 10:04:38.023464 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.023448 2525 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-1.ec2.internal" Apr 16 10:04:38.023548 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:38.023470 2525 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-1.ec2.internal\": node \"ip-10-0-135-1.ec2.internal\" not found" Apr 16 10:04:38.045820 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:38.045792 2525 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-1.ec2.internal\" not found" Apr 16 10:04:38.088552 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.088506 2525 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-1.ec2.internal"] Apr 16 10:04:38.088648 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.088593 2525 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:04:38.089432 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.089410 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:04:38.089532 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.089439 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:04:38.089532 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.089449 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:04:38.090681 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.090669 2525 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:04:38.090856 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.090843 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal" Apr 16 10:04:38.090891 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.090872 2525 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:04:38.091369 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.091347 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:04:38.091465 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.091379 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:04:38.091465 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.091392 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:04:38.091465 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.091400 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:04:38.091465 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.091424 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:04:38.091465 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.091439 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:04:38.093676 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.093659 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-1.ec2.internal" Apr 16 10:04:38.093762 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.093691 2525 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:04:38.094362 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.094346 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:04:38.094451 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.094370 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:04:38.094451 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.094380 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:04:38.107030 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:38.107011 2525 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-1.ec2.internal\" not found" node="ip-10-0-135-1.ec2.internal" Apr 16 10:04:38.110785 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:38.110768 2525 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-1.ec2.internal\" not found" node="ip-10-0-135-1.ec2.internal" Apr 16 10:04:38.146547 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:38.146525 2525 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-1.ec2.internal\" not found" Apr 16 10:04:38.156399 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.156378 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d58cb40e2271cf732604ae518ffd1966-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal\" (UID: 
\"d58cb40e2271cf732604ae518ffd1966\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal" Apr 16 10:04:38.156473 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.156403 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d58cb40e2271cf732604ae518ffd1966-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal\" (UID: \"d58cb40e2271cf732604ae518ffd1966\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal" Apr 16 10:04:38.156473 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.156422 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b1bfdb1b2e8b28ab395d30767607393b-config\") pod \"kube-apiserver-proxy-ip-10-0-135-1.ec2.internal\" (UID: \"b1bfdb1b2e8b28ab395d30767607393b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-1.ec2.internal" Apr 16 10:04:38.247124 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:38.247049 2525 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-1.ec2.internal\" not found" Apr 16 10:04:38.257401 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.257378 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d58cb40e2271cf732604ae518ffd1966-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal\" (UID: \"d58cb40e2271cf732604ae518ffd1966\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal" Apr 16 10:04:38.257494 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.257409 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d58cb40e2271cf732604ae518ffd1966-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal\" (UID: \"d58cb40e2271cf732604ae518ffd1966\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal" Apr 16 10:04:38.257494 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.257425 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b1bfdb1b2e8b28ab395d30767607393b-config\") pod \"kube-apiserver-proxy-ip-10-0-135-1.ec2.internal\" (UID: \"b1bfdb1b2e8b28ab395d30767607393b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-1.ec2.internal" Apr 16 10:04:38.257494 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.257452 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b1bfdb1b2e8b28ab395d30767607393b-config\") pod \"kube-apiserver-proxy-ip-10-0-135-1.ec2.internal\" (UID: \"b1bfdb1b2e8b28ab395d30767607393b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-1.ec2.internal" Apr 16 10:04:38.257494 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.257480 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d58cb40e2271cf732604ae518ffd1966-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal\" (UID: \"d58cb40e2271cf732604ae518ffd1966\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal" Apr 16 10:04:38.257651 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.257480 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d58cb40e2271cf732604ae518ffd1966-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal\" (UID: \"d58cb40e2271cf732604ae518ffd1966\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal" Apr 16 10:04:38.347801 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:38.347761 
2525 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-1.ec2.internal\" not found"
Apr 16 10:04:38.411297 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.411271 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal"
Apr 16 10:04:38.414424 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.414407 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-1.ec2.internal"
Apr 16 10:04:38.448661 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:38.448625 2525 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-1.ec2.internal\" not found"
Apr 16 10:04:38.549252 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:38.549166 2525 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-1.ec2.internal\" not found"
Apr 16 10:04:38.649680 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:38.649648 2525 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-1.ec2.internal\" not found"
Apr 16 10:04:38.750240 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:38.750206 2525 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-1.ec2.internal\" not found"
Apr 16 10:04:38.757472 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.757454 2525 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 10:04:38.757619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.757600 2525 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 10:04:38.757658 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.757635 2525 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 10:04:38.850254 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.850169 2525 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 09:59:37 +0000 UTC" deadline="2027-11-18 09:33:39.446837778 +0000 UTC"
Apr 16 10:04:38.850254 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.850209 2525 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13943h29m0.596633666s"
Apr 16 10:04:38.850485 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:38.850266 2525 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-1.ec2.internal\" not found"
Apr 16 10:04:38.852993 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.852978 2525 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 10:04:38.867528 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.867490 2525 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 10:04:38.889185 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.889159 2525 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jkxlr"
Apr 16 10:04:38.891213 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:38.891186 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd58cb40e2271cf732604ae518ffd1966.slice/crio-9a2ebb62fd9c217820861d29b6219510194111309344198ba07d1f128a890ac6 WatchSource:0}: Error finding container 9a2ebb62fd9c217820861d29b6219510194111309344198ba07d1f128a890ac6: Status 404 returned error can't find the container with id 9a2ebb62fd9c217820861d29b6219510194111309344198ba07d1f128a890ac6
Apr 16 10:04:38.891502 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:38.891479 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1bfdb1b2e8b28ab395d30767607393b.slice/crio-7c91fb4fc32c439e8e923cce45311c050fba6a27d0ecb8034c9f33926f352751 WatchSource:0}: Error finding container 7c91fb4fc32c439e8e923cce45311c050fba6a27d0ecb8034c9f33926f352751: Status 404 returned error can't find the container with id 7c91fb4fc32c439e8e923cce45311c050fba6a27d0ecb8034c9f33926f352751
Apr 16 10:04:38.894479 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.894459 2525 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jkxlr"
Apr 16 10:04:38.897501 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.897488 2525 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 10:04:38.951155 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:38.951119 2525 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-1.ec2.internal\" not found"
Apr 16 10:04:38.990484 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.990436 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-1.ec2.internal" event={"ID":"b1bfdb1b2e8b28ab395d30767607393b","Type":"ContainerStarted","Data":"7c91fb4fc32c439e8e923cce45311c050fba6a27d0ecb8034c9f33926f352751"}
Apr 16 10:04:38.991368 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:38.991342 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal" event={"ID":"d58cb40e2271cf732604ae518ffd1966","Type":"ContainerStarted","Data":"9a2ebb62fd9c217820861d29b6219510194111309344198ba07d1f128a890ac6"}
Apr 16 10:04:39.051591 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:39.051562 2525 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-1.ec2.internal\" not found"
Apr 16 10:04:39.074403 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.074382 2525 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 10:04:39.152606 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:39.152543 2525 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-1.ec2.internal\" not found"
Apr 16 10:04:39.253096 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:39.253063 2525 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-1.ec2.internal\" not found"
Apr 16 10:04:39.353922 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:39.353887 2525 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-1.ec2.internal\" not found"
Apr 16 10:04:39.383802 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.383762 2525 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 10:04:39.453877 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.453791 2525 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal"
Apr 16 10:04:39.464868 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.464842 2525 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 10:04:39.465950 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.465915 2525 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-1.ec2.internal"
Apr 16 10:04:39.473648 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.473623 2525 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 10:04:39.793008 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.792941 2525 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 10:04:39.828850 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.828821 2525 apiserver.go:52] "Watching apiserver"
Apr 16 10:04:39.837400 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.837373 2525 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 10:04:39.838453 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.838429 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-pwsbd","openshift-dns/node-resolver-scbdk","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal","openshift-multus/multus-additional-cni-plugins-fpw2w","openshift-network-diagnostics/network-check-target-nz7jd","openshift-network-operator/iptables-alerter-htz8x","kube-system/kube-apiserver-proxy-ip-10-0-135-1.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv","openshift-cluster-node-tuning-operator/tuned-mfrrt","openshift-image-registry/node-ca-hhqrb","openshift-multus/multus-pmkq5","openshift-multus/network-metrics-daemon-mczdx","openshift-ovn-kubernetes/ovnkube-node-rk78v"]
Apr 16 10:04:39.841937 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.841808 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hhqrb"
Apr 16 10:04:39.843088 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.843066 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-scbdk"
Apr 16 10:04:39.845667 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.844710 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 10:04:39.845667 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.845411 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fpw2w"
Apr 16 10:04:39.845667 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.845456 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7nsgk\""
Apr 16 10:04:39.846185 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.846168 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 10:04:39.848234 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.846869 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd"
Apr 16 10:04:39.848234 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:39.846984 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6"
Apr 16 10:04:39.848234 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.847477 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 10:04:39.848234 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.847501 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 10:04:39.848234 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.847563 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 10:04:39.848538 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.848236 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sb49x\""
Apr 16 10:04:39.848538 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.848332 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-htz8x"
Apr 16 10:04:39.849873 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.849851 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 10:04:39.849978 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.849907 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 10:04:39.849978 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.849935 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 10:04:39.850148 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.850077 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 10:04:39.850148 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.850143 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fdrpx\""
Apr 16 10:04:39.851413 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.850976 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv"
Apr 16 10:04:39.853420 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.853400 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 10:04:39.853695 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.853664 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 10:04:39.853695 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.853686 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 10:04:39.853809 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.853719 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 10:04:39.853809 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.853759 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2kc5f\""
Apr 16 10:04:39.853809 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.853674 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 10:04:39.854139 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.853989 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jn6ct\""
Apr 16 10:04:39.854139 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.853999 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 10:04:39.854291 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.854246 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 10:04:39.854764 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.854744 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.856364 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.856341 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pwsbd"
Apr 16 10:04:39.856559 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.856503 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pmkq5"
Apr 16 10:04:39.857006 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.856989 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dg9zt\""
Apr 16 10:04:39.857162 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.857149 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 10:04:39.857295 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.857273 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 10:04:39.858010 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.857992 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx"
Apr 16 10:04:39.858107 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:39.858069 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66"
Apr 16 10:04:39.858674 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.858551 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-tglkg\""
Apr 16 10:04:39.858674 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.858587 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qntdr\""
Apr 16 10:04:39.859149 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.858866 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 10:04:39.859149 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.858874 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 10:04:39.859149 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.859052 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 10:04:39.859758 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.859741 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v"
Apr 16 10:04:39.862336 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.862318 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 10:04:39.862548 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.862531 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 10:04:39.862692 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.862660 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 10:04:39.862692 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.862318 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 10:04:39.863138 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.863120 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 10:04:39.863220 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.863164 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 10:04:39.863325 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.863299 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-nrs6w\""
Apr 16 10:04:39.866956 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.866938 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-sysconfig\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.867051 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.866966 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-lib-modules\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.867051 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.866991 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-tuned\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.867051 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867016 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb2d0ec7-6730-4604-8528-e4a98ff4857d-host-slash\") pod \"iptables-alerter-htz8x\" (UID: \"eb2d0ec7-6730-4604-8528-e4a98ff4857d\") " pod="openshift-network-operator/iptables-alerter-htz8x"
Apr 16 10:04:39.867051 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867040 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-etc-selinux\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv"
Apr 16 10:04:39.867247 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867087 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crb77\" (UniqueName: \"kubernetes.io/projected/9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4-kube-api-access-crb77\") pod \"node-resolver-scbdk\" (UID: \"9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4\") " pod="openshift-dns/node-resolver-scbdk"
Apr 16 10:04:39.867247 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867146 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dc3554db-b5f4-43be-8180-e48573e27cb6-os-release\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w"
Apr 16 10:04:39.867247 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867172 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztrhm\" (UniqueName: \"kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm\") pod \"network-check-target-nz7jd\" (UID: \"19d87478-ac42-469f-a6d5-8ca391f485f6\") " pod="openshift-network-diagnostics/network-check-target-nz7jd"
Apr 16 10:04:39.867247 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867191 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-sys-fs\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv"
Apr 16 10:04:39.867247 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867238 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dc3554db-b5f4-43be-8180-e48573e27cb6-system-cni-dir\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w"
Apr 16 10:04:39.867444 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867264 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-run\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.867444 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867287 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-sys\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.867444 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867338 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-device-dir\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv"
Apr 16 10:04:39.867444 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867371 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcn69\" (UniqueName: \"kubernetes.io/projected/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-kube-api-access-kcn69\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv"
Apr 16 10:04:39.867444 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867409 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfbds\" (UniqueName: \"kubernetes.io/projected/dc3554db-b5f4-43be-8180-e48573e27cb6-kube-api-access-vfbds\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w"
Apr 16 10:04:39.867444 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867434 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xbtl\" (UniqueName: \"kubernetes.io/projected/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-kube-api-access-8xbtl\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.867709 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867458 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3912faf1-da9e-41de-8d4c-b2cdb354f252-serviceca\") pod \"node-ca-hhqrb\" (UID: \"3912faf1-da9e-41de-8d4c-b2cdb354f252\") " pod="openshift-image-registry/node-ca-hhqrb"
Apr 16 10:04:39.867709 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867481 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7t9c\" (UniqueName: \"kubernetes.io/projected/eb2d0ec7-6730-4604-8528-e4a98ff4857d-kube-api-access-r7t9c\") pod \"iptables-alerter-htz8x\" (UID: \"eb2d0ec7-6730-4604-8528-e4a98ff4857d\") " pod="openshift-network-operator/iptables-alerter-htz8x"
Apr 16 10:04:39.867709 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867545 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dc3554db-b5f4-43be-8180-e48573e27cb6-cnibin\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w"
Apr 16 10:04:39.867709 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867570 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dc3554db-b5f4-43be-8180-e48573e27cb6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w"
Apr 16 10:04:39.867709 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867586 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfsgt\" (UniqueName: \"kubernetes.io/projected/3912faf1-da9e-41de-8d4c-b2cdb354f252-kube-api-access-kfsgt\") pod \"node-ca-hhqrb\" (UID: \"3912faf1-da9e-41de-8d4c-b2cdb354f252\") " pod="openshift-image-registry/node-ca-hhqrb"
Apr 16 10:04:39.867709 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867603 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-socket-dir\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv"
Apr 16 10:04:39.867709 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867625 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-kubernetes\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.867709 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867652 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-sysctl-d\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.867709 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867668 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-systemd\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.867709 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867688 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-tmp\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.867709 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867710 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4-hosts-file\") pod \"node-resolver-scbdk\" (UID: \"9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4\") " pod="openshift-dns/node-resolver-scbdk"
Apr 16 10:04:39.868097 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867749 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dc3554db-b5f4-43be-8180-e48573e27cb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w"
Apr 16 10:04:39.868097 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867771 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-modprobe-d\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.868097 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867822 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/34e490e3-17d2-4fa4-ba1d-76247d80e91c-agent-certs\") pod \"konnectivity-agent-pwsbd\" (UID: \"34e490e3-17d2-4fa4-ba1d-76247d80e91c\") " pod="kube-system/konnectivity-agent-pwsbd"
Apr 16 10:04:39.868097 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867855 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/34e490e3-17d2-4fa4-ba1d-76247d80e91c-konnectivity-ca\") pod \"konnectivity-agent-pwsbd\" (UID: \"34e490e3-17d2-4fa4-ba1d-76247d80e91c\") " pod="kube-system/konnectivity-agent-pwsbd"
Apr 16 10:04:39.868097 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867884 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/eb2d0ec7-6730-4604-8528-e4a98ff4857d-iptables-alerter-script\") pod \"iptables-alerter-htz8x\" (UID: \"eb2d0ec7-6730-4604-8528-e4a98ff4857d\") " pod="openshift-network-operator/iptables-alerter-htz8x"
Apr 16 10:04:39.868097 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867909 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-kubelet-dir\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv"
Apr 16 10:04:39.868097 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867936 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4-tmp-dir\") pod \"node-resolver-scbdk\" (UID: \"9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4\") " pod="openshift-dns/node-resolver-scbdk"
Apr 16 10:04:39.868097 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.867979 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dc3554db-b5f4-43be-8180-e48573e27cb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w"
Apr 16 10:04:39.868097 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.868004 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dc3554db-b5f4-43be-8180-e48573e27cb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w"
Apr 16 10:04:39.868097 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.868030 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-sysctl-conf\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.868097 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.868054 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-var-lib-kubelet\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.868605 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.868106 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-host\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.868605 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.868164 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3912faf1-da9e-41de-8d4c-b2cdb354f252-host\") pod \"node-ca-hhqrb\" (UID: \"3912faf1-da9e-41de-8d4c-b2cdb354f252\") " pod="openshift-image-registry/node-ca-hhqrb"
Apr 16 10:04:39.868605 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.868193 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-registration-dir\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv"
Apr 16 10:04:39.895143 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.895114 2525 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 09:59:38 +0000 UTC" deadline="2027-12-28 17:47:34.101544954 +0000 UTC"
Apr 16 10:04:39.895143 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.895139 2525 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14911h42m54.206408099s"
Apr 16 10:04:39.929844 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.929815 2525 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 10:04:39.955553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.955526 2525 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 10:04:39.968715 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.968673 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-host\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt"
Apr 16 10:04:39.968715 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.968714 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3912faf1-da9e-41de-8d4c-b2cdb354f252-host\") pod \"node-ca-hhqrb\" (UID: \"3912faf1-da9e-41de-8d4c-b2cdb354f252\") " pod="openshift-image-registry/node-ca-hhqrb"
Apr 16 10:04:39.968935 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.968747 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-run-netns\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v"
Apr 16 10:04:39.968935 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.968776 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-var-lib-openvswitch\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v"
Apr 16 10:04:39.968935 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.968805 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-tuned\") pod \"tuned-mfrrt\" (UID:
\"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.968935 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.968830 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-etc-selinux\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" Apr 16 10:04:39.968935 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.968863 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-run-ovn\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.968935 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.968903 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-multus-cni-dir\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.968935 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.968932 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-run-k8s-cni-cncf-io\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.969265 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.968960 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-run\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.969265 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.968985 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-device-dir\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" Apr 16 10:04:39.969265 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969013 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-ovn-node-metrics-cert\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.969265 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969036 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnz52\" (UniqueName: \"kubernetes.io/projected/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-kube-api-access-cnz52\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.969265 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969063 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfbds\" (UniqueName: \"kubernetes.io/projected/dc3554db-b5f4-43be-8180-e48573e27cb6-kube-api-access-vfbds\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.969265 ip-10-0-135-1 kubenswrapper[2525]: I0416 
10:04:39.969090 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xbtl\" (UniqueName: \"kubernetes.io/projected/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-kube-api-access-8xbtl\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.969265 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969109 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3912faf1-da9e-41de-8d4c-b2cdb354f252-host\") pod \"node-ca-hhqrb\" (UID: \"3912faf1-da9e-41de-8d4c-b2cdb354f252\") " pod="openshift-image-registry/node-ca-hhqrb" Apr 16 10:04:39.969265 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969118 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-slash\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.969265 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969156 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-cni-bin\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.969265 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969182 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-cni-netd\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.969265 ip-10-0-135-1 
kubenswrapper[2525]: I0416 10:04:39.969205 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-run-netns\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.969265 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969239 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-multus-conf-dir\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969279 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dc3554db-b5f4-43be-8180-e48573e27cb6-cnibin\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969307 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dc3554db-b5f4-43be-8180-e48573e27cb6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969335 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfsgt\" (UniqueName: \"kubernetes.io/projected/3912faf1-da9e-41de-8d4c-b2cdb354f252-kube-api-access-kfsgt\") pod \"node-ca-hhqrb\" (UID: \"3912faf1-da9e-41de-8d4c-b2cdb354f252\") " 
pod="openshift-image-registry/node-ca-hhqrb" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969361 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-kubernetes\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969406 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-tmp\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969447 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-system-cni-dir\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969477 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6wwd\" (UniqueName: \"kubernetes.io/projected/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-kube-api-access-c6wwd\") pod \"network-metrics-daemon-mczdx\" (UID: \"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66\") " pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969504 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4-hosts-file\") pod \"node-resolver-scbdk\" (UID: 
\"9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4\") " pod="openshift-dns/node-resolver-scbdk" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969546 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dc3554db-b5f4-43be-8180-e48573e27cb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969498 2525 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969576 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-modprobe-d\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969609 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/eb2d0ec7-6730-4604-8528-e4a98ff4857d-iptables-alerter-script\") pod \"iptables-alerter-htz8x\" (UID: \"eb2d0ec7-6730-4604-8528-e4a98ff4857d\") " pod="openshift-network-operator/iptables-alerter-htz8x" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969640 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-run-ovn-kubernetes\") pod \"ovnkube-node-rk78v\" (UID: 
\"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969692 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/27405ca4-ee70-4403-a147-625dbadaa808-cni-binary-copy\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969695 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-run\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969733 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dc3554db-b5f4-43be-8180-e48573e27cb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.969788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.968802 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-host\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.970482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969761 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-var-lib-kubelet\") pod \"tuned-mfrrt\" (UID: 
\"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.970482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969753 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-device-dir\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" Apr 16 10:04:39.970482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969787 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-registration-dir\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" Apr 16 10:04:39.970482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969935 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-etc-selinux\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" Apr 16 10:04:39.970482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969935 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-var-lib-kubelet\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.970482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.969987 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-kubernetes\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.970482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970261 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4-hosts-file\") pod \"node-resolver-scbdk\" (UID: \"9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4\") " pod="openshift-dns/node-resolver-scbdk" Apr 16 10:04:39.970482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970306 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-run-openvswitch\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.970482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970334 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-var-lib-cni-bin\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.970482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970363 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-sysconfig\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.970482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970387 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-lib-modules\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.970482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970408 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb2d0ec7-6730-4604-8528-e4a98ff4857d-host-slash\") pod \"iptables-alerter-htz8x\" (UID: \"eb2d0ec7-6730-4604-8528-e4a98ff4857d\") " pod="openshift-network-operator/iptables-alerter-htz8x" Apr 16 10:04:39.970482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970429 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-run-systemd\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.970482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970453 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-etc-openvswitch\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970490 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/eb2d0ec7-6730-4604-8528-e4a98ff4857d-iptables-alerter-script\") pod \"iptables-alerter-htz8x\" (UID: \"eb2d0ec7-6730-4604-8528-e4a98ff4857d\") " pod="openshift-network-operator/iptables-alerter-htz8x" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970550 2525 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970575 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-env-overrides\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970598 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crb77\" (UniqueName: \"kubernetes.io/projected/9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4-kube-api-access-crb77\") pod \"node-resolver-scbdk\" (UID: \"9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4\") " pod="openshift-dns/node-resolver-scbdk" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970627 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dc3554db-b5f4-43be-8180-e48573e27cb6-os-release\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970680 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztrhm\" (UniqueName: \"kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm\") pod \"network-check-target-nz7jd\" (UID: \"19d87478-ac42-469f-a6d5-8ca391f485f6\") " 
pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970711 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-sys-fs\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970737 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-kubelet\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970763 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-var-lib-cni-multus\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970790 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-var-lib-kubelet\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970815 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-hostroot\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970844 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dc3554db-b5f4-43be-8180-e48573e27cb6-system-cni-dir\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970868 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-sys\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970894 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcn69\" (UniqueName: \"kubernetes.io/projected/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-kube-api-access-kcn69\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.970926 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-systemd-units\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.971085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971033 2525 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-node-log\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971044 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dc3554db-b5f4-43be-8180-e48573e27cb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971058 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-etc-kubernetes\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971082 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3912faf1-da9e-41de-8d4c-b2cdb354f252-serviceca\") pod \"node-ca-hhqrb\" (UID: \"3912faf1-da9e-41de-8d4c-b2cdb354f252\") " pod="openshift-image-registry/node-ca-hhqrb" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971105 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7t9c\" (UniqueName: \"kubernetes.io/projected/eb2d0ec7-6730-4604-8528-e4a98ff4857d-kube-api-access-r7t9c\") pod \"iptables-alerter-htz8x\" (UID: \"eb2d0ec7-6730-4604-8528-e4a98ff4857d\") " pod="openshift-network-operator/iptables-alerter-htz8x" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 
10:04:39.971145 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-modprobe-d\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971153 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-cnibin\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971175 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-os-release\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971196 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-registration-dir\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971200 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-run-multus-certs\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.971829 ip-10-0-135-1 
kubenswrapper[2525]: I0416 10:04:39.971233 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-socket-dir\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971261 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-ovnkube-config\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971287 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-ovnkube-script-lib\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971294 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-sysconfig\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971315 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs\") pod \"network-metrics-daemon-mczdx\" (UID: \"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66\") " 
pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971344 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-sysctl-d\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971371 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-systemd\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.971829 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971383 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-lib-modules\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971399 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmb9h\" (UniqueName: \"kubernetes.io/projected/27405ca4-ee70-4403-a147-625dbadaa808-kube-api-access-nmb9h\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971411 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb2d0ec7-6730-4604-8528-e4a98ff4857d-host-slash\") pod \"iptables-alerter-htz8x\" (UID: \"eb2d0ec7-6730-4604-8528-e4a98ff4857d\") " 
pod="openshift-network-operator/iptables-alerter-htz8x" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971591 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-socket-dir\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971712 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-sysctl-d\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971765 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-systemd\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.971801 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dc3554db-b5f4-43be-8180-e48573e27cb6-cnibin\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.972147 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/34e490e3-17d2-4fa4-ba1d-76247d80e91c-agent-certs\") pod \"konnectivity-agent-pwsbd\" (UID: \"34e490e3-17d2-4fa4-ba1d-76247d80e91c\") " 
pod="kube-system/konnectivity-agent-pwsbd" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.972186 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/34e490e3-17d2-4fa4-ba1d-76247d80e91c-konnectivity-ca\") pod \"konnectivity-agent-pwsbd\" (UID: \"34e490e3-17d2-4fa4-ba1d-76247d80e91c\") " pod="kube-system/konnectivity-agent-pwsbd" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.972217 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-kubelet-dir\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.972276 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-log-socket\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.972305 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-multus-socket-dir-parent\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.972334 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/27405ca4-ee70-4403-a147-625dbadaa808-multus-daemon-config\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.972364 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4-tmp-dir\") pod \"node-resolver-scbdk\" (UID: \"9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4\") " pod="openshift-dns/node-resolver-scbdk" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.972392 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dc3554db-b5f4-43be-8180-e48573e27cb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.972422 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-sysctl-conf\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.972619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.972596 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-sysctl-conf\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.973319 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.972645 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-kubelet-dir\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" Apr 16 10:04:39.973319 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.972879 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dc3554db-b5f4-43be-8180-e48573e27cb6-os-release\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.973319 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.972968 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-sys-fs\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" Apr 16 10:04:39.973319 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.973222 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dc3554db-b5f4-43be-8180-e48573e27cb6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.973503 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.973334 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4-tmp-dir\") pod \"node-resolver-scbdk\" (UID: \"9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4\") " pod="openshift-dns/node-resolver-scbdk" Apr 16 10:04:39.973573 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.973545 2525 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-etc-tuned\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.973573 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.973545 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dc3554db-b5f4-43be-8180-e48573e27cb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.973670 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.973581 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-sys\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.973670 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.973615 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dc3554db-b5f4-43be-8180-e48573e27cb6-system-cni-dir\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.973670 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.973622 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-tmp\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.974597 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.974551 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dc3554db-b5f4-43be-8180-e48573e27cb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.977362 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.975124 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/34e490e3-17d2-4fa4-ba1d-76247d80e91c-konnectivity-ca\") pod \"konnectivity-agent-pwsbd\" (UID: \"34e490e3-17d2-4fa4-ba1d-76247d80e91c\") " pod="kube-system/konnectivity-agent-pwsbd" Apr 16 10:04:39.977362 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.975528 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3912faf1-da9e-41de-8d4c-b2cdb354f252-serviceca\") pod \"node-ca-hhqrb\" (UID: \"3912faf1-da9e-41de-8d4c-b2cdb354f252\") " pod="openshift-image-registry/node-ca-hhqrb" Apr 16 10:04:39.980520 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.979413 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/34e490e3-17d2-4fa4-ba1d-76247d80e91c-agent-certs\") pod \"konnectivity-agent-pwsbd\" (UID: \"34e490e3-17d2-4fa4-ba1d-76247d80e91c\") " pod="kube-system/konnectivity-agent-pwsbd" Apr 16 10:04:39.980520 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.979575 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfbds\" (UniqueName: \"kubernetes.io/projected/dc3554db-b5f4-43be-8180-e48573e27cb6-kube-api-access-vfbds\") pod \"multus-additional-cni-plugins-fpw2w\" (UID: \"dc3554db-b5f4-43be-8180-e48573e27cb6\") " pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:39.980520 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:39.979609 2525 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 10:04:39.980520 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:39.979633 2525 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 10:04:39.980520 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:39.979647 2525 projected.go:194] Error preparing data for projected volume kube-api-access-ztrhm for pod openshift-network-diagnostics/network-check-target-nz7jd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:39.980520 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:39.979753 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm podName:19d87478-ac42-469f-a6d5-8ca391f485f6 nodeName:}" failed. No retries permitted until 2026-04-16 10:04:40.479709457 +0000 UTC m=+3.079157410 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ztrhm" (UniqueName: "kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm") pod "network-check-target-nz7jd" (UID: "19d87478-ac42-469f-a6d5-8ca391f485f6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:39.980867 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.980851 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xbtl\" (UniqueName: \"kubernetes.io/projected/0d5f6a61-ffda-4acb-83ec-7d961f2ef458-kube-api-access-8xbtl\") pod \"tuned-mfrrt\" (UID: \"0d5f6a61-ffda-4acb-83ec-7d961f2ef458\") " pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:39.981816 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.981368 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfsgt\" (UniqueName: \"kubernetes.io/projected/3912faf1-da9e-41de-8d4c-b2cdb354f252-kube-api-access-kfsgt\") pod \"node-ca-hhqrb\" (UID: \"3912faf1-da9e-41de-8d4c-b2cdb354f252\") " pod="openshift-image-registry/node-ca-hhqrb" Apr 16 10:04:39.982402 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.982105 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcn69\" (UniqueName: \"kubernetes.io/projected/cd617d7e-1f50-481b-ab3f-e9a32bb0d538-kube-api-access-kcn69\") pod \"aws-ebs-csi-driver-node-t5bxv\" (UID: \"cd617d7e-1f50-481b-ab3f-e9a32bb0d538\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" Apr 16 10:04:39.983144 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.983120 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7t9c\" (UniqueName: \"kubernetes.io/projected/eb2d0ec7-6730-4604-8528-e4a98ff4857d-kube-api-access-r7t9c\") pod \"iptables-alerter-htz8x\" (UID: 
\"eb2d0ec7-6730-4604-8528-e4a98ff4857d\") " pod="openshift-network-operator/iptables-alerter-htz8x" Apr 16 10:04:39.983486 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:39.983419 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crb77\" (UniqueName: \"kubernetes.io/projected/9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4-kube-api-access-crb77\") pod \"node-resolver-scbdk\" (UID: \"9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4\") " pod="openshift-dns/node-resolver-scbdk" Apr 16 10:04:40.073239 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073155 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-log-socket\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.073239 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073199 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-multus-socket-dir-parent\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.073239 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073232 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-log-socket\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.073239 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073223 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/27405ca4-ee70-4403-a147-625dbadaa808-multus-daemon-config\") pod \"multus-pmkq5\" (UID: 
\"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.073553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073299 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-run-netns\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.073553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073314 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-multus-socket-dir-parent\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.073553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073327 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-var-lib-openvswitch\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.073553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073357 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-run-ovn\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.073553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073359 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-run-netns\") pod \"ovnkube-node-rk78v\" (UID: 
\"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.073553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073380 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-multus-cni-dir\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.073553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073399 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-var-lib-openvswitch\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.073553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073411 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-run-k8s-cni-cncf-io\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.073553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073434 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-run-ovn\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.073553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073438 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-ovn-node-metrics-cert\") pod \"ovnkube-node-rk78v\" (UID: 
\"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.073553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073468 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnz52\" (UniqueName: \"kubernetes.io/projected/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-kube-api-access-cnz52\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.073553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073496 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-slash\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.073553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073536 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-cni-bin\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.073553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073558 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-cni-netd\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073583 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-run-netns\") pod \"multus-pmkq5\" (UID: 
\"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073605 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-multus-conf-dir\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073632 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-system-cni-dir\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073661 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6wwd\" (UniqueName: \"kubernetes.io/projected/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-kube-api-access-c6wwd\") pod \"network-metrics-daemon-mczdx\" (UID: \"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66\") " pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073690 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-run-ovn-kubernetes\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073715 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/27405ca4-ee70-4403-a147-625dbadaa808-cni-binary-copy\") pod \"multus-pmkq5\" (UID: 
\"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073743 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-run-openvswitch\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073769 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-var-lib-cni-bin\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073770 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-run-netns\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073802 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-run-systemd\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073819 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/27405ca4-ee70-4403-a147-625dbadaa808-multus-daemon-config\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " 
pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073825 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-etc-openvswitch\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073850 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073906 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-run-openvswitch\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073906 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-system-cni-dir\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073928 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-env-overrides\") pod \"ovnkube-node-rk78v\" (UID: 
\"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073954 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-var-lib-cni-bin\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073974 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-kubelet\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.073999 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-var-lib-cni-multus\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074000 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-run-systemd\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074033 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-etc-openvswitch\") pod \"ovnkube-node-rk78v\" (UID: 
\"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074048 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-multus-conf-dir\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074059 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-var-lib-kubelet\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074085 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-hostroot\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074097 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-kubelet\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074113 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-slash\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074131 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-var-lib-cni-multus\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074112 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-systemd-units\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074143 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-systemd-units\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074161 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-node-log\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074166 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-cni-bin\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074178 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-var-lib-kubelet\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074185 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-etc-kubernetes\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074212 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-cnibin\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074215 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-cni-netd\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.074784 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074215 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-hostroot\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 
10:04:40.074247 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-etc-kubernetes\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074247 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/27405ca4-ee70-4403-a147-625dbadaa808-cni-binary-copy\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074271 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-multus-cni-dir\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074301 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-node-log\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074304 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074322 2525 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-os-release\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074326 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-run-k8s-cni-cncf-io\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074345 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-run-multus-certs\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074370 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-ovnkube-config\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074368 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-host-run-ovn-kubernetes\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074396 2525 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-ovnkube-script-lib\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074400 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-os-release\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074432 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs\") pod \"network-metrics-daemon-mczdx\" (UID: \"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66\") " pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074445 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-host-run-multus-certs\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074467 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmb9h\" (UniqueName: \"kubernetes.io/projected/27405ca4-ee70-4403-a147-625dbadaa808-kube-api-access-nmb9h\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:40.074597 2525 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074648 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/27405ca4-ee70-4403-a147-625dbadaa808-cnibin\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.075416 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:40.074672 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs podName:4739af65-2ca2-4e8f-9b0e-ddeac76a9b66 nodeName:}" failed. No retries permitted until 2026-04-16 10:04:40.574653725 +0000 UTC m=+3.174101660 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs") pod "network-metrics-daemon-mczdx" (UID: "4739af65-2ca2-4e8f-9b0e-ddeac76a9b66") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:40.076086 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074693 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-env-overrides\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.076086 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.074944 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-ovnkube-script-lib\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.076086 ip-10-0-135-1 kubenswrapper[2525]: 
I0416 10:04:40.075138 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-ovnkube-config\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.076307 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.076277 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-ovn-node-metrics-cert\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.083132 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.083111 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmb9h\" (UniqueName: \"kubernetes.io/projected/27405ca4-ee70-4403-a147-625dbadaa808-kube-api-access-nmb9h\") pod \"multus-pmkq5\" (UID: \"27405ca4-ee70-4403-a147-625dbadaa808\") " pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.083280 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.083263 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6wwd\" (UniqueName: \"kubernetes.io/projected/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-kube-api-access-c6wwd\") pod \"network-metrics-daemon-mczdx\" (UID: \"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66\") " pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:40.083321 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.083281 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnz52\" (UniqueName: \"kubernetes.io/projected/2bdec43e-45b2-4aa1-8605-fa162a8fc58b-kube-api-access-cnz52\") pod \"ovnkube-node-rk78v\" (UID: \"2bdec43e-45b2-4aa1-8605-fa162a8fc58b\") " pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.157280 ip-10-0-135-1 
kubenswrapper[2525]: I0416 10:04:40.157243 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hhqrb" Apr 16 10:04:40.164178 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.164151 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-scbdk" Apr 16 10:04:40.174922 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.174885 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fpw2w" Apr 16 10:04:40.180599 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.180576 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-htz8x" Apr 16 10:04:40.189293 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.189269 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" Apr 16 10:04:40.196960 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.196938 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" Apr 16 10:04:40.205505 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.205486 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pwsbd" Apr 16 10:04:40.213077 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.213057 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pmkq5" Apr 16 10:04:40.218632 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.218613 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:04:40.446533 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:40.446492 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27405ca4_ee70_4403_a147_625dbadaa808.slice/crio-f6459a55b2fdb169a344dcc0527af629ebd493a4bfd548f49233313040621e85 WatchSource:0}: Error finding container f6459a55b2fdb169a344dcc0527af629ebd493a4bfd548f49233313040621e85: Status 404 returned error can't find the container with id f6459a55b2fdb169a344dcc0527af629ebd493a4bfd548f49233313040621e85 Apr 16 10:04:40.447791 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:40.447762 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb2d0ec7_6730_4604_8528_e4a98ff4857d.slice/crio-e0dc0987b83ee726ef8a6d4f51273267d53fadd482b5dc640939c1a6093abb7c WatchSource:0}: Error finding container e0dc0987b83ee726ef8a6d4f51273267d53fadd482b5dc640939c1a6093abb7c: Status 404 returned error can't find the container with id e0dc0987b83ee726ef8a6d4f51273267d53fadd482b5dc640939c1a6093abb7c Apr 16 10:04:40.448567 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:40.448531 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c0ede08_fd0f_4aba_9ec1_e44fb8d11bd4.slice/crio-c61cc5e4a6db065b42c45207a396903bfbf70650a1024a6ce316ec667f63420f WatchSource:0}: Error finding container c61cc5e4a6db065b42c45207a396903bfbf70650a1024a6ce316ec667f63420f: Status 404 returned error can't find the container with id c61cc5e4a6db065b42c45207a396903bfbf70650a1024a6ce316ec667f63420f Apr 16 10:04:40.449445 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:40.449351 2525 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bdec43e_45b2_4aa1_8605_fa162a8fc58b.slice/crio-1c5fb2e14cec2be436090cdfe98b8d570c3957304cd14d0356f0b58d56512296 WatchSource:0}: Error finding container 1c5fb2e14cec2be436090cdfe98b8d570c3957304cd14d0356f0b58d56512296: Status 404 returned error can't find the container with id 1c5fb2e14cec2be436090cdfe98b8d570c3957304cd14d0356f0b58d56512296 Apr 16 10:04:40.452946 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:40.452833 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd617d7e_1f50_481b_ab3f_e9a32bb0d538.slice/crio-7a458cff3d32f9c437e832d016ee38c87dadefb4e57394cc88c7e4b67a28334a WatchSource:0}: Error finding container 7a458cff3d32f9c437e832d016ee38c87dadefb4e57394cc88c7e4b67a28334a: Status 404 returned error can't find the container with id 7a458cff3d32f9c437e832d016ee38c87dadefb4e57394cc88c7e4b67a28334a Apr 16 10:04:40.453952 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:40.453929 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34e490e3_17d2_4fa4_ba1d_76247d80e91c.slice/crio-a65fa8eec06712fe726e2b9bf62cc62f794def22d75c934c4f700df79fbb87be WatchSource:0}: Error finding container a65fa8eec06712fe726e2b9bf62cc62f794def22d75c934c4f700df79fbb87be: Status 404 returned error can't find the container with id a65fa8eec06712fe726e2b9bf62cc62f794def22d75c934c4f700df79fbb87be Apr 16 10:04:40.454429 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:40.454405 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d5f6a61_ffda_4acb_83ec_7d961f2ef458.slice/crio-db5a0cf409f2e083d011c814a30e056e58b74be891c0e4819bb97c9d221a048e WatchSource:0}: Error finding container db5a0cf409f2e083d011c814a30e056e58b74be891c0e4819bb97c9d221a048e: Status 404 returned error can't find the 
container with id db5a0cf409f2e083d011c814a30e056e58b74be891c0e4819bb97c9d221a048e Apr 16 10:04:40.455784 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:40.455760 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3912faf1_da9e_41de_8d4c_b2cdb354f252.slice/crio-fb1007421e6e5b710b21c99ea9414e0cc1352f63b2c7b4f5ead3cda4ce83822b WatchSource:0}: Error finding container fb1007421e6e5b710b21c99ea9414e0cc1352f63b2c7b4f5ead3cda4ce83822b: Status 404 returned error can't find the container with id fb1007421e6e5b710b21c99ea9414e0cc1352f63b2c7b4f5ead3cda4ce83822b Apr 16 10:04:40.457465 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:04:40.457440 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc3554db_b5f4_43be_8180_e48573e27cb6.slice/crio-eda8fe7478608ace2e02680d5cd89d23087fe9a7f29b3735e96e16052610a2b9 WatchSource:0}: Error finding container eda8fe7478608ace2e02680d5cd89d23087fe9a7f29b3735e96e16052610a2b9: Status 404 returned error can't find the container with id eda8fe7478608ace2e02680d5cd89d23087fe9a7f29b3735e96e16052610a2b9 Apr 16 10:04:40.578132 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.577942 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztrhm\" (UniqueName: \"kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm\") pod \"network-check-target-nz7jd\" (UID: \"19d87478-ac42-469f-a6d5-8ca391f485f6\") " pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:40.578338 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.578161 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs\") pod \"network-metrics-daemon-mczdx\" (UID: \"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66\") " 
pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:40.578338 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:40.578103 2525 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 10:04:40.578338 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:40.578206 2525 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 10:04:40.578338 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:40.578216 2525 projected.go:194] Error preparing data for projected volume kube-api-access-ztrhm for pod openshift-network-diagnostics/network-check-target-nz7jd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:40.578338 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:40.578258 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm podName:19d87478-ac42-469f-a6d5-8ca391f485f6 nodeName:}" failed. No retries permitted until 2026-04-16 10:04:41.578244625 +0000 UTC m=+4.177692559 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ztrhm" (UniqueName: "kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm") pod "network-check-target-nz7jd" (UID: "19d87478-ac42-469f-a6d5-8ca391f485f6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:40.578338 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:40.578310 2525 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:40.578579 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:40.578362 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs podName:4739af65-2ca2-4e8f-9b0e-ddeac76a9b66 nodeName:}" failed. No retries permitted until 2026-04-16 10:04:41.578346215 +0000 UTC m=+4.177794151 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs") pod "network-metrics-daemon-mczdx" (UID: "4739af65-2ca2-4e8f-9b0e-ddeac76a9b66") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:40.896666 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.896372 2525 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 09:59:38 +0000 UTC" deadline="2027-12-28 07:20:29.766579054 +0000 UTC" Apr 16 10:04:40.896666 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.896411 2525 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14901h15m48.87017174s" Apr 16 10:04:40.988398 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.988363 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:40.988590 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:40.988539 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:04:40.999339 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:40.999210 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-1.ec2.internal" event={"ID":"b1bfdb1b2e8b28ab395d30767607393b","Type":"ContainerStarted","Data":"015522dca10c3834035c93421e24069a3cb641c5a6d9ab252812f2efb9922601"} Apr 16 10:04:41.001440 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:41.001377 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" event={"ID":"0d5f6a61-ffda-4acb-83ec-7d961f2ef458","Type":"ContainerStarted","Data":"db5a0cf409f2e083d011c814a30e056e58b74be891c0e4819bb97c9d221a048e"} Apr 16 10:04:41.003139 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:41.003088 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" event={"ID":"2bdec43e-45b2-4aa1-8605-fa162a8fc58b","Type":"ContainerStarted","Data":"1c5fb2e14cec2be436090cdfe98b8d570c3957304cd14d0356f0b58d56512296"} Apr 16 10:04:41.005190 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:41.005168 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-scbdk" event={"ID":"9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4","Type":"ContainerStarted","Data":"c61cc5e4a6db065b42c45207a396903bfbf70650a1024a6ce316ec667f63420f"} Apr 16 10:04:41.006459 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:41.006438 2525 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-htz8x" event={"ID":"eb2d0ec7-6730-4604-8528-e4a98ff4857d","Type":"ContainerStarted","Data":"e0dc0987b83ee726ef8a6d4f51273267d53fadd482b5dc640939c1a6093abb7c"} Apr 16 10:04:41.007827 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:41.007804 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpw2w" event={"ID":"dc3554db-b5f4-43be-8180-e48573e27cb6","Type":"ContainerStarted","Data":"eda8fe7478608ace2e02680d5cd89d23087fe9a7f29b3735e96e16052610a2b9"} Apr 16 10:04:41.009249 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:41.009219 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hhqrb" event={"ID":"3912faf1-da9e-41de-8d4c-b2cdb354f252","Type":"ContainerStarted","Data":"fb1007421e6e5b710b21c99ea9414e0cc1352f63b2c7b4f5ead3cda4ce83822b"} Apr 16 10:04:41.011874 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:41.011852 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pwsbd" event={"ID":"34e490e3-17d2-4fa4-ba1d-76247d80e91c","Type":"ContainerStarted","Data":"a65fa8eec06712fe726e2b9bf62cc62f794def22d75c934c4f700df79fbb87be"} Apr 16 10:04:41.013837 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:41.013813 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" event={"ID":"cd617d7e-1f50-481b-ab3f-e9a32bb0d538","Type":"ContainerStarted","Data":"7a458cff3d32f9c437e832d016ee38c87dadefb4e57394cc88c7e4b67a28334a"} Apr 16 10:04:41.018064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:41.018037 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pmkq5" event={"ID":"27405ca4-ee70-4403-a147-625dbadaa808","Type":"ContainerStarted","Data":"f6459a55b2fdb169a344dcc0527af629ebd493a4bfd548f49233313040621e85"} Apr 16 10:04:41.586741 ip-10-0-135-1 kubenswrapper[2525]: 
I0416 10:04:41.586701 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztrhm\" (UniqueName: \"kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm\") pod \"network-check-target-nz7jd\" (UID: \"19d87478-ac42-469f-a6d5-8ca391f485f6\") " pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:41.586939 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:41.586769 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs\") pod \"network-metrics-daemon-mczdx\" (UID: \"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66\") " pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:41.586939 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:41.586913 2525 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:41.587060 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:41.586974 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs podName:4739af65-2ca2-4e8f-9b0e-ddeac76a9b66 nodeName:}" failed. No retries permitted until 2026-04-16 10:04:43.586955623 +0000 UTC m=+6.186403562 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs") pod "network-metrics-daemon-mczdx" (UID: "4739af65-2ca2-4e8f-9b0e-ddeac76a9b66") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:41.587401 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:41.587381 2525 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 10:04:41.587475 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:41.587405 2525 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 10:04:41.587475 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:41.587418 2525 projected.go:194] Error preparing data for projected volume kube-api-access-ztrhm for pod openshift-network-diagnostics/network-check-target-nz7jd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:41.587475 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:41.587464 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm podName:19d87478-ac42-469f-a6d5-8ca391f485f6 nodeName:}" failed. No retries permitted until 2026-04-16 10:04:43.587448957 +0000 UTC m=+6.186896895 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ztrhm" (UniqueName: "kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm") pod "network-check-target-nz7jd" (UID: "19d87478-ac42-469f-a6d5-8ca391f485f6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:41.989846 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:41.989756 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:41.990285 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:41.989885 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6" Apr 16 10:04:42.044581 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:42.043796 2525 generic.go:358] "Generic (PLEG): container finished" podID="d58cb40e2271cf732604ae518ffd1966" containerID="ad675431c53e212ad9cc5adceb7c67c5fbc6dd7e4fef74efb7219b107511113b" exitCode=0 Apr 16 10:04:42.044857 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:42.044769 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal" event={"ID":"d58cb40e2271cf732604ae518ffd1966","Type":"ContainerDied","Data":"ad675431c53e212ad9cc5adceb7c67c5fbc6dd7e4fef74efb7219b107511113b"} Apr 16 10:04:42.058674 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:42.058624 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-1.ec2.internal" podStartSLOduration=3.058598304 
podStartE2EDuration="3.058598304s" podCreationTimestamp="2026-04-16 10:04:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:04:41.011308214 +0000 UTC m=+3.610756173" watchObservedRunningTime="2026-04-16 10:04:42.058598304 +0000 UTC m=+4.658046263" Apr 16 10:04:42.988937 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:42.988904 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:42.989115 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:42.989048 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:04:43.055711 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:43.054993 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal" event={"ID":"d58cb40e2271cf732604ae518ffd1966","Type":"ContainerStarted","Data":"5eb470639721b8477c28214baa69e56b15c94af66e4dfd4167b2285eb38718e0"} Apr 16 10:04:43.068110 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:43.068063 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-1.ec2.internal" podStartSLOduration=4.068042648 podStartE2EDuration="4.068042648s" podCreationTimestamp="2026-04-16 10:04:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:04:43.067231222 +0000 UTC m=+5.666679179" watchObservedRunningTime="2026-04-16 
10:04:43.068042648 +0000 UTC m=+5.667490603" Apr 16 10:04:43.603893 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:43.603389 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztrhm\" (UniqueName: \"kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm\") pod \"network-check-target-nz7jd\" (UID: \"19d87478-ac42-469f-a6d5-8ca391f485f6\") " pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:43.603893 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:43.603452 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs\") pod \"network-metrics-daemon-mczdx\" (UID: \"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66\") " pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:43.603893 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:43.603619 2525 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:43.603893 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:43.603681 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs podName:4739af65-2ca2-4e8f-9b0e-ddeac76a9b66 nodeName:}" failed. No retries permitted until 2026-04-16 10:04:47.603662522 +0000 UTC m=+10.203110461 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs") pod "network-metrics-daemon-mczdx" (UID: "4739af65-2ca2-4e8f-9b0e-ddeac76a9b66") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:43.607914 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:43.604966 2525 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 10:04:43.607914 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:43.604993 2525 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 10:04:43.607914 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:43.605033 2525 projected.go:194] Error preparing data for projected volume kube-api-access-ztrhm for pod openshift-network-diagnostics/network-check-target-nz7jd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:43.607914 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:43.605145 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm podName:19d87478-ac42-469f-a6d5-8ca391f485f6 nodeName:}" failed. No retries permitted until 2026-04-16 10:04:47.605120734 +0000 UTC m=+10.204568671 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ztrhm" (UniqueName: "kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm") pod "network-check-target-nz7jd" (UID: "19d87478-ac42-469f-a6d5-8ca391f485f6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:43.988745 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:43.988661 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:43.988892 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:43.988783 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6" Apr 16 10:04:44.988417 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:44.988385 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:44.988872 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:44.988534 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:04:45.988314 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:45.988280 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:45.988477 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:45.988416 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6" Apr 16 10:04:46.988252 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:46.988220 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:46.988433 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:46.988362 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:04:47.638539 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:47.638489 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztrhm\" (UniqueName: \"kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm\") pod \"network-check-target-nz7jd\" (UID: \"19d87478-ac42-469f-a6d5-8ca391f485f6\") " pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:47.639010 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:47.638567 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs\") pod \"network-metrics-daemon-mczdx\" (UID: \"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66\") " pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:47.639010 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:47.638660 2525 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 10:04:47.639010 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:47.638679 2525 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 10:04:47.639010 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:47.638693 2525 projected.go:194] Error preparing data for projected volume kube-api-access-ztrhm for pod openshift-network-diagnostics/network-check-target-nz7jd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:47.639010 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:47.638735 2525 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:47.639010 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:47.638762 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm podName:19d87478-ac42-469f-a6d5-8ca391f485f6 nodeName:}" failed. No retries permitted until 2026-04-16 10:04:55.6387441 +0000 UTC m=+18.238192048 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ztrhm" (UniqueName: "kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm") pod "network-check-target-nz7jd" (UID: "19d87478-ac42-469f-a6d5-8ca391f485f6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:47.639010 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:47.638803 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs podName:4739af65-2ca2-4e8f-9b0e-ddeac76a9b66 nodeName:}" failed. No retries permitted until 2026-04-16 10:04:55.638786081 +0000 UTC m=+18.238234020 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs") pod "network-metrics-daemon-mczdx" (UID: "4739af65-2ca2-4e8f-9b0e-ddeac76a9b66") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:47.989234 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:47.989151 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:47.989417 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:47.989285 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6" Apr 16 10:04:48.988390 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:48.988358 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:48.988791 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:48.988465 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:04:49.988364 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:49.988285 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:49.988545 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:49.988420 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6" Apr 16 10:04:50.988477 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:50.988441 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:50.988668 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:50.988581 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:04:51.988554 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:51.988502 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:51.988714 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:51.988657 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6" Apr 16 10:04:52.987932 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:52.987901 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:52.988107 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:52.988004 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:04:53.988659 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:53.988623 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:53.989014 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:53.988730 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6" Apr 16 10:04:54.987986 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:54.987948 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:54.988198 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:54.988105 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:04:55.695330 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:55.695282 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztrhm\" (UniqueName: \"kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm\") pod \"network-check-target-nz7jd\" (UID: \"19d87478-ac42-469f-a6d5-8ca391f485f6\") " pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:55.695751 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:55.695337 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs\") pod \"network-metrics-daemon-mczdx\" (UID: \"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66\") " pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:55.695751 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:55.695426 2525 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:55.695751 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:55.695442 2525 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 10:04:55.695751 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:55.695457 2525 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 10:04:55.695751 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:55.695469 2525 projected.go:194] Error preparing data for projected volume kube-api-access-ztrhm for pod openshift-network-diagnostics/network-check-target-nz7jd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:55.695751 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:55.695490 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs podName:4739af65-2ca2-4e8f-9b0e-ddeac76a9b66 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:11.695471741 +0000 UTC m=+34.294919682 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs") pod "network-metrics-daemon-mczdx" (UID: "4739af65-2ca2-4e8f-9b0e-ddeac76a9b66") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:55.695751 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:55.695506 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm podName:19d87478-ac42-469f-a6d5-8ca391f485f6 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:11.69549601 +0000 UTC m=+34.294943963 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ztrhm" (UniqueName: "kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm") pod "network-check-target-nz7jd" (UID: "19d87478-ac42-469f-a6d5-8ca391f485f6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:55.988612 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:55.988536 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:55.988771 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:55.988648 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6" Apr 16 10:04:56.988313 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:56.988276 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:56.988810 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:56.988426 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:04:57.989773 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:57.989121 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:57.989773 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:57.989221 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6" Apr 16 10:04:58.081885 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:58.081858 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" event={"ID":"0d5f6a61-ffda-4acb-83ec-7d961f2ef458","Type":"ContainerStarted","Data":"13ec79e74bad8fa92d7dc6dd5820c10b7314bc962ca1536c4aba9e4fc09e0b22"} Apr 16 10:04:58.085223 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:58.085195 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-scbdk" event={"ID":"9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4","Type":"ContainerStarted","Data":"07e7cf1e1a4cb18ba6ffc1ca6e1ff0e52c89f38dee9617faf8cb193d75e8de2b"} Apr 16 10:04:58.087038 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:58.087016 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpw2w" event={"ID":"dc3554db-b5f4-43be-8180-e48573e27cb6","Type":"ContainerStarted","Data":"b920200c1d99959c37f453a91f7637074363a2af771382fc88cbc639d9cf9ba2"} Apr 16 10:04:58.088474 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:58.088451 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hhqrb" event={"ID":"3912faf1-da9e-41de-8d4c-b2cdb354f252","Type":"ContainerStarted","Data":"cc4b88d776cd3cd97266cd254e8db00ca571181a1f7b6f027e28617722ac8a53"} Apr 16 10:04:58.089889 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:58.089849 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pwsbd" event={"ID":"34e490e3-17d2-4fa4-ba1d-76247d80e91c","Type":"ContainerStarted","Data":"0f83eba62ba0132f1a4b156e2dc1f3cf98809c6c11dd2158c11e4cd260853560"} Apr 16 10:04:58.091286 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:58.091269 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" 
event={"ID":"cd617d7e-1f50-481b-ab3f-e9a32bb0d538","Type":"ContainerStarted","Data":"e2ff84cef3175758378f5ccb9eed8d566bbfee39ca7d3b6cb9ef81ba0854289d"} Apr 16 10:04:58.099091 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:58.099053 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-scbdk" podStartSLOduration=2.836008651 podStartE2EDuration="20.099041919s" podCreationTimestamp="2026-04-16 10:04:38 +0000 UTC" firstStartedPulling="2026-04-16 10:04:40.451022875 +0000 UTC m=+3.050470810" lastFinishedPulling="2026-04-16 10:04:57.714056143 +0000 UTC m=+20.313504078" observedRunningTime="2026-04-16 10:04:58.098713495 +0000 UTC m=+20.698161446" watchObservedRunningTime="2026-04-16 10:04:58.099041919 +0000 UTC m=+20.698489875" Apr 16 10:04:58.124794 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:58.124755 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hhqrb" podStartSLOduration=2.868953608 podStartE2EDuration="20.124743149s" podCreationTimestamp="2026-04-16 10:04:38 +0000 UTC" firstStartedPulling="2026-04-16 10:04:40.458204826 +0000 UTC m=+3.057652765" lastFinishedPulling="2026-04-16 10:04:57.713994358 +0000 UTC m=+20.313442306" observedRunningTime="2026-04-16 10:04:58.124543684 +0000 UTC m=+20.723991637" watchObservedRunningTime="2026-04-16 10:04:58.124743149 +0000 UTC m=+20.724191104" Apr 16 10:04:58.125041 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:58.125021 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pwsbd" podStartSLOduration=2.922975337 podStartE2EDuration="20.125016475s" podCreationTimestamp="2026-04-16 10:04:38 +0000 UTC" firstStartedPulling="2026-04-16 10:04:40.457045393 +0000 UTC m=+3.056493332" lastFinishedPulling="2026-04-16 10:04:57.65908652 +0000 UTC m=+20.258534470" observedRunningTime="2026-04-16 10:04:58.111237641 +0000 UTC m=+20.710685598" 
watchObservedRunningTime="2026-04-16 10:04:58.125016475 +0000 UTC m=+20.724464430" Apr 16 10:04:58.988292 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:58.988263 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:04:58.988459 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:58.988377 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:04:59.094951 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.094915 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-htz8x" event={"ID":"eb2d0ec7-6730-4604-8528-e4a98ff4857d","Type":"ContainerStarted","Data":"7df8a838721beab10782eef01a18dc4daa747bf9183b06c9971544d76ffae2af"} Apr 16 10:04:59.096479 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.096453 2525 generic.go:358] "Generic (PLEG): container finished" podID="dc3554db-b5f4-43be-8180-e48573e27cb6" containerID="b920200c1d99959c37f453a91f7637074363a2af771382fc88cbc639d9cf9ba2" exitCode=0 Apr 16 10:04:59.096624 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.096545 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpw2w" event={"ID":"dc3554db-b5f4-43be-8180-e48573e27cb6","Type":"ContainerDied","Data":"b920200c1d99959c37f453a91f7637074363a2af771382fc88cbc639d9cf9ba2"} Apr 16 10:04:59.098336 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.098312 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pmkq5" 
event={"ID":"27405ca4-ee70-4403-a147-625dbadaa808","Type":"ContainerStarted","Data":"4f17748a20a43d57c58a303719ca7b8de1ae2dbe3bda907ba86a7f326f84eaa8"} Apr 16 10:04:59.101441 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.101404 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" event={"ID":"2bdec43e-45b2-4aa1-8605-fa162a8fc58b","Type":"ContainerStarted","Data":"4508e9d024d14efd7cfc2fdb29507837d3a66ef293a56466fbb4e933e7de851f"} Apr 16 10:04:59.101536 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.101449 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" event={"ID":"2bdec43e-45b2-4aa1-8605-fa162a8fc58b","Type":"ContainerStarted","Data":"a0c8a38311127ed050526176060e67f91c99aca20720009c36578d34985ee4ec"} Apr 16 10:04:59.101536 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.101463 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" event={"ID":"2bdec43e-45b2-4aa1-8605-fa162a8fc58b","Type":"ContainerStarted","Data":"ea5ee9b086e996a1778cd8d6ef47b82ddf473d884bd3d417e329a32094158861"} Apr 16 10:04:59.101536 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.101475 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" event={"ID":"2bdec43e-45b2-4aa1-8605-fa162a8fc58b","Type":"ContainerStarted","Data":"eac5d280d8e4bcc64e5b6d439db571ae92b9d7ff6eadac02bf3ce641727a5edb"} Apr 16 10:04:59.101536 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.101486 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" event={"ID":"2bdec43e-45b2-4aa1-8605-fa162a8fc58b","Type":"ContainerStarted","Data":"61fb2012aa1d559d4a89e1d417dc54f1925d6812d03e84868cf513329c63327e"} Apr 16 10:04:59.101536 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.101501 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" event={"ID":"2bdec43e-45b2-4aa1-8605-fa162a8fc58b","Type":"ContainerStarted","Data":"c33318fbed41243391474c02a19070f7a5f725daa5046b8caf275a220f36fb93"} Apr 16 10:04:59.109103 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.109061 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-htz8x" podStartSLOduration=3.845501554 podStartE2EDuration="21.109050512s" podCreationTimestamp="2026-04-16 10:04:38 +0000 UTC" firstStartedPulling="2026-04-16 10:04:40.450475432 +0000 UTC m=+3.049923367" lastFinishedPulling="2026-04-16 10:04:57.714024372 +0000 UTC m=+20.313472325" observedRunningTime="2026-04-16 10:04:59.10886938 +0000 UTC m=+21.708317335" watchObservedRunningTime="2026-04-16 10:04:59.109050512 +0000 UTC m=+21.708498468" Apr 16 10:04:59.175345 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.175294 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-mfrrt" podStartSLOduration=3.914386296 podStartE2EDuration="21.175280473s" podCreationTimestamp="2026-04-16 10:04:38 +0000 UTC" firstStartedPulling="2026-04-16 10:04:40.45657409 +0000 UTC m=+3.056022041" lastFinishedPulling="2026-04-16 10:04:57.717468281 +0000 UTC m=+20.316916218" observedRunningTime="2026-04-16 10:04:59.17502244 +0000 UTC m=+21.774470408" watchObservedRunningTime="2026-04-16 10:04:59.175280473 +0000 UTC m=+21.774728428" Apr 16 10:04:59.191023 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.190961 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pmkq5" podStartSLOduration=3.615928076 podStartE2EDuration="21.190946434s" podCreationTimestamp="2026-04-16 10:04:38 +0000 UTC" firstStartedPulling="2026-04-16 10:04:40.448454193 +0000 UTC m=+3.047902127" lastFinishedPulling="2026-04-16 10:04:58.023472548 +0000 UTC m=+20.622920485" observedRunningTime="2026-04-16 
10:04:59.190463654 +0000 UTC m=+21.789911610" watchObservedRunningTime="2026-04-16 10:04:59.190946434 +0000 UTC m=+21.790394387" Apr 16 10:04:59.479154 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.479132 2525 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 10:04:59.925590 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.925449 2525 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T10:04:59.479149124Z","UUID":"562303ba-c38a-4d75-b20f-af287f91bb3b","Handler":null,"Name":"","Endpoint":""} Apr 16 10:04:59.928884 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.928853 2525 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 10:04:59.928884 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.928882 2525 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 10:04:59.988604 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:04:59.988556 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:04:59.988790 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:04:59.988688 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6" Apr 16 10:05:00.106300 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:00.106261 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" event={"ID":"cd617d7e-1f50-481b-ab3f-e9a32bb0d538","Type":"ContainerStarted","Data":"41d560872963e929ba86c0317f7b51395488309977fc6161533ad1ff2fdcfa64"} Apr 16 10:05:00.988446 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:00.988406 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:05:00.988654 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:00.988569 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:05:01.110185 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:01.110142 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" event={"ID":"cd617d7e-1f50-481b-ab3f-e9a32bb0d538","Type":"ContainerStarted","Data":"220caac3992a7e9a86567d7b609e1d14cc96cd5765249999b0cb0f28aded04e9"} Apr 16 10:05:01.113806 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:01.113768 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" event={"ID":"2bdec43e-45b2-4aa1-8605-fa162a8fc58b","Type":"ContainerStarted","Data":"9bf7b5298bbf19620f0bc4eb9d762886d92eedec0ccd464d006715353fa47cc2"} Apr 16 10:05:01.125603 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:01.125557 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t5bxv" podStartSLOduration=3.166603058 podStartE2EDuration="23.125543428s" podCreationTimestamp="2026-04-16 10:04:38 +0000 UTC" firstStartedPulling="2026-04-16 10:04:40.454713107 +0000 UTC m=+3.054161045" lastFinishedPulling="2026-04-16 10:05:00.413653467 +0000 UTC m=+23.013101415" observedRunningTime="2026-04-16 10:05:01.125234277 +0000 UTC m=+23.724682234" watchObservedRunningTime="2026-04-16 10:05:01.125543428 +0000 UTC m=+23.724991383" Apr 16 10:05:01.869594 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:01.869314 2525 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-pwsbd" Apr 16 10:05:01.870073 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:01.870048 2525 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pwsbd" Apr 16 10:05:01.988341 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:01.988310 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:05:01.988551 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:01.988429 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6" Apr 16 10:05:02.988375 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:02.988293 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:05:02.988773 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:02.988434 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:05:03.989090 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:03.988874 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:05:03.989757 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:03.989196 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6" Apr 16 10:05:04.121352 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:04.121250 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" event={"ID":"2bdec43e-45b2-4aa1-8605-fa162a8fc58b","Type":"ContainerStarted","Data":"7fa3635b6c9a4f9d9472202491ff752d9c6720065f7c61186ed4a62496553916"} Apr 16 10:05:04.121593 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:04.121559 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:05:04.122821 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:04.122795 2525 generic.go:358] "Generic (PLEG): container finished" podID="dc3554db-b5f4-43be-8180-e48573e27cb6" containerID="a069b05ee53a5696b98c51532252d09d48348a3c170f04b97e6402d9b1c16332" exitCode=0 Apr 16 10:05:04.122901 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:04.122828 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpw2w" event={"ID":"dc3554db-b5f4-43be-8180-e48573e27cb6","Type":"ContainerDied","Data":"a069b05ee53a5696b98c51532252d09d48348a3c170f04b97e6402d9b1c16332"} Apr 16 10:05:04.136971 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:04.136933 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:05:04.151197 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:04.151145 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" podStartSLOduration=8.481882889 podStartE2EDuration="26.151131341s" podCreationTimestamp="2026-04-16 10:04:38 +0000 UTC" firstStartedPulling="2026-04-16 10:04:40.453049553 +0000 UTC m=+3.052497487" lastFinishedPulling="2026-04-16 10:04:58.122298005 +0000 UTC m=+20.721745939" observedRunningTime="2026-04-16 
10:05:04.150804604 +0000 UTC m=+26.750252560" watchObservedRunningTime="2026-04-16 10:05:04.151131341 +0000 UTC m=+26.750579296" Apr 16 10:05:04.988702 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:04.988667 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:05:04.988971 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:04.988795 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:05:05.126827 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:05.126808 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:05:05.127131 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:05.126854 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:05:05.142790 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:05.142760 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v" Apr 16 10:05:05.292183 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:05.292148 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nz7jd"] Apr 16 10:05:05.292356 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:05.292286 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:05:05.292421 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:05.292381 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6" Apr 16 10:05:05.294676 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:05.294650 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mczdx"] Apr 16 10:05:05.294805 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:05.294750 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:05:05.294875 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:05.294855 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:05:06.127860 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:06.127829 2525 generic.go:358] "Generic (PLEG): container finished" podID="dc3554db-b5f4-43be-8180-e48573e27cb6" containerID="c625e1c11b53ffa8a8c677e21f611898278eb83bd503c30e98509684ecad332c" exitCode=0 Apr 16 10:05:06.128228 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:06.127915 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpw2w" event={"ID":"dc3554db-b5f4-43be-8180-e48573e27cb6","Type":"ContainerDied","Data":"c625e1c11b53ffa8a8c677e21f611898278eb83bd503c30e98509684ecad332c"} Apr 16 10:05:06.402858 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:06.402778 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pwsbd" Apr 16 10:05:06.402998 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:06.402921 2525 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 10:05:06.403395 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:06.403378 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pwsbd" Apr 16 10:05:06.988113 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:06.988040 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:05:06.988252 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:06.988045 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:05:06.988252 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:06.988154 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:05:06.988252 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:06.988240 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6" Apr 16 10:05:07.131600 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:07.131375 2525 generic.go:358] "Generic (PLEG): container finished" podID="dc3554db-b5f4-43be-8180-e48573e27cb6" containerID="37ffdc49bfe83dee0b45140efa126275ef70684bda2e1d095df6f6565c73137a" exitCode=0 Apr 16 10:05:07.132002 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:07.131421 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpw2w" event={"ID":"dc3554db-b5f4-43be-8180-e48573e27cb6","Type":"ContainerDied","Data":"37ffdc49bfe83dee0b45140efa126275ef70684bda2e1d095df6f6565c73137a"} Apr 16 10:05:08.988200 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:08.988166 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:05:08.988200 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:08.988196 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:05:08.988681 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:08.988305 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66" Apr 16 10:05:08.988681 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:08.988498 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nz7jd" podUID="19d87478-ac42-469f-a6d5-8ca391f485f6" Apr 16 10:05:10.770073 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.769994 2525 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-1.ec2.internal" event="NodeReady" Apr 16 10:05:10.770563 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.770138 2525 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 10:05:10.817734 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.817694 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sh9pl"] Apr 16 10:05:10.852444 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.852413 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xnnjq"] Apr 16 10:05:10.852647 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.852610 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sh9pl" Apr 16 10:05:10.855190 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.855163 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 10:05:10.855329 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.855209 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ccsn6\"" Apr 16 10:05:10.855329 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.855170 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 10:05:10.868137 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.868099 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sh9pl"] Apr 16 10:05:10.868137 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.868140 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xnnjq"] Apr 16 
10:05:10.868357 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.868261 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xnnjq" Apr 16 10:05:10.870289 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.870266 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 10:05:10.870427 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.870338 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 10:05:10.870566 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.870522 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cwtl6\"" Apr 16 10:05:10.870648 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.870591 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 10:05:10.988220 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.988180 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd" Apr 16 10:05:10.988396 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.988180 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:05:10.990757 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.990732 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 10:05:10.990757 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.990747 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mvsr8\"" Apr 16 10:05:10.990959 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.990790 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 10:05:10.990959 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.990902 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-btvgh\"" Apr 16 10:05:10.991069 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:10.990965 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 10:05:11.010689 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.010666 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7kts\" (UniqueName: \"kubernetes.io/projected/86d6efca-ec4a-40d2-a200-6d8dacba5368-kube-api-access-j7kts\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl" Apr 16 10:05:11.010821 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.010708 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4dz6\" (UniqueName: \"kubernetes.io/projected/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-kube-api-access-x4dz6\") pod \"ingress-canary-xnnjq\" (UID: \"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06\") " 
pod="openshift-ingress-canary/ingress-canary-xnnjq" Apr 16 10:05:11.010821 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.010735 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86d6efca-ec4a-40d2-a200-6d8dacba5368-config-volume\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl" Apr 16 10:05:11.010821 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.010763 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86d6efca-ec4a-40d2-a200-6d8dacba5368-tmp-dir\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl" Apr 16 10:05:11.010821 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.010807 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert\") pod \"ingress-canary-xnnjq\" (UID: \"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06\") " pod="openshift-ingress-canary/ingress-canary-xnnjq" Apr 16 10:05:11.010980 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.010909 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl" Apr 16 10:05:11.111644 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.111606 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert\") pod \"ingress-canary-xnnjq\" (UID: \"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06\") " 
pod="openshift-ingress-canary/ingress-canary-xnnjq" Apr 16 10:05:11.111835 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.111653 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl" Apr 16 10:05:11.111835 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.111686 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7kts\" (UniqueName: \"kubernetes.io/projected/86d6efca-ec4a-40d2-a200-6d8dacba5368-kube-api-access-j7kts\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl" Apr 16 10:05:11.111835 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.111701 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4dz6\" (UniqueName: \"kubernetes.io/projected/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-kube-api-access-x4dz6\") pod \"ingress-canary-xnnjq\" (UID: \"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06\") " pod="openshift-ingress-canary/ingress-canary-xnnjq" Apr 16 10:05:11.111835 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.111730 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86d6efca-ec4a-40d2-a200-6d8dacba5368-config-volume\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl" Apr 16 10:05:11.111835 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:11.111753 2525 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 10:05:11.111835 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.111773 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86d6efca-ec4a-40d2-a200-6d8dacba5368-tmp-dir\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:05:11.111835 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:11.111823 2525 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 10:05:11.112161 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:11.111826 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert podName:4a03f4b8-27ee-4dd2-8150-2b73a73e4f06 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:11.611804207 +0000 UTC m=+34.211252142 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert") pod "ingress-canary-xnnjq" (UID: "4a03f4b8-27ee-4dd2-8150-2b73a73e4f06") : secret "canary-serving-cert" not found
Apr 16 10:05:11.112161 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:11.111905 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls podName:86d6efca-ec4a-40d2-a200-6d8dacba5368 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:11.61188645 +0000 UTC m=+34.211334391 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls") pod "dns-default-sh9pl" (UID: "86d6efca-ec4a-40d2-a200-6d8dacba5368") : secret "dns-default-metrics-tls" not found
Apr 16 10:05:11.112161 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.112136 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86d6efca-ec4a-40d2-a200-6d8dacba5368-tmp-dir\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:05:11.112368 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.112346 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86d6efca-ec4a-40d2-a200-6d8dacba5368-config-volume\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:05:11.124094 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.124067 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7kts\" (UniqueName: \"kubernetes.io/projected/86d6efca-ec4a-40d2-a200-6d8dacba5368-kube-api-access-j7kts\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:05:11.135655 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.135633 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4dz6\" (UniqueName: \"kubernetes.io/projected/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-kube-api-access-x4dz6\") pod \"ingress-canary-xnnjq\" (UID: \"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06\") " pod="openshift-ingress-canary/ingress-canary-xnnjq"
Apr 16 10:05:11.615210 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.615169 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert\") pod \"ingress-canary-xnnjq\" (UID: \"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06\") " pod="openshift-ingress-canary/ingress-canary-xnnjq"
Apr 16 10:05:11.615393 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.615232 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:05:11.615393 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:11.615335 2525 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 10:05:11.615496 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:11.615417 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert podName:4a03f4b8-27ee-4dd2-8150-2b73a73e4f06 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:12.615399466 +0000 UTC m=+35.214847400 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert") pod "ingress-canary-xnnjq" (UID: "4a03f4b8-27ee-4dd2-8150-2b73a73e4f06") : secret "canary-serving-cert" not found
Apr 16 10:05:11.615496 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:11.615335 2525 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 10:05:11.615496 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:11.615482 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls podName:86d6efca-ec4a-40d2-a200-6d8dacba5368 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:12.6154688 +0000 UTC m=+35.214916741 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls") pod "dns-default-sh9pl" (UID: "86d6efca-ec4a-40d2-a200-6d8dacba5368") : secret "dns-default-metrics-tls" not found
Apr 16 10:05:11.715944 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.715908 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztrhm\" (UniqueName: \"kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm\") pod \"network-check-target-nz7jd\" (UID: \"19d87478-ac42-469f-a6d5-8ca391f485f6\") " pod="openshift-network-diagnostics/network-check-target-nz7jd"
Apr 16 10:05:11.716128 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.715979 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs\") pod \"network-metrics-daemon-mczdx\" (UID: \"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66\") " pod="openshift-multus/network-metrics-daemon-mczdx"
Apr 16 10:05:11.716128 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:11.716105 2525 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 10:05:11.716256 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:11.716178 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs podName:4739af65-2ca2-4e8f-9b0e-ddeac76a9b66 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:43.716157839 +0000 UTC m=+66.315605779 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs") pod "network-metrics-daemon-mczdx" (UID: "4739af65-2ca2-4e8f-9b0e-ddeac76a9b66") : secret "metrics-daemon-secret" not found
Apr 16 10:05:11.719133 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.719102 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztrhm\" (UniqueName: \"kubernetes.io/projected/19d87478-ac42-469f-a6d5-8ca391f485f6-kube-api-access-ztrhm\") pod \"network-check-target-nz7jd\" (UID: \"19d87478-ac42-469f-a6d5-8ca391f485f6\") " pod="openshift-network-diagnostics/network-check-target-nz7jd"
Apr 16 10:05:11.900246 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:11.900151 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nz7jd"
Apr 16 10:05:12.623465 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:12.623103 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert\") pod \"ingress-canary-xnnjq\" (UID: \"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06\") " pod="openshift-ingress-canary/ingress-canary-xnnjq"
Apr 16 10:05:12.623465 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:12.623332 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:05:12.623465 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:12.623245 2525 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 10:05:12.623465 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:12.623417 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert podName:4a03f4b8-27ee-4dd2-8150-2b73a73e4f06 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:14.623398487 +0000 UTC m=+37.222846443 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert") pod "ingress-canary-xnnjq" (UID: "4a03f4b8-27ee-4dd2-8150-2b73a73e4f06") : secret "canary-serving-cert" not found
Apr 16 10:05:12.623465 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:12.623429 2525 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 10:05:12.623465 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:12.623466 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls podName:86d6efca-ec4a-40d2-a200-6d8dacba5368 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:14.623458239 +0000 UTC m=+37.222906176 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls") pod "dns-default-sh9pl" (UID: "86d6efca-ec4a-40d2-a200-6d8dacba5368") : secret "dns-default-metrics-tls" not found
Apr 16 10:05:12.696542 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:12.696492 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nz7jd"]
Apr 16 10:05:12.699914 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:05:12.699887 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19d87478_ac42_469f_a6d5_8ca391f485f6.slice/crio-d41fe14c87402d66df586d399e4bdbc74f416caa51ae0d15923d172cb89fa5d7 WatchSource:0}: Error finding container d41fe14c87402d66df586d399e4bdbc74f416caa51ae0d15923d172cb89fa5d7: Status 404 returned error can't find the container with id d41fe14c87402d66df586d399e4bdbc74f416caa51ae0d15923d172cb89fa5d7
Apr 16 10:05:13.145341 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:13.145301 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpw2w" event={"ID":"dc3554db-b5f4-43be-8180-e48573e27cb6","Type":"ContainerStarted","Data":"e2f0a93135860cc5685001a8f0bf500cabed5b120536d85c206191eb0f58cc00"}
Apr 16 10:05:13.146328 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:13.146306 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nz7jd" event={"ID":"19d87478-ac42-469f-a6d5-8ca391f485f6","Type":"ContainerStarted","Data":"d41fe14c87402d66df586d399e4bdbc74f416caa51ae0d15923d172cb89fa5d7"}
Apr 16 10:05:14.150956 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:14.150917 2525 generic.go:358] "Generic (PLEG): container finished" podID="dc3554db-b5f4-43be-8180-e48573e27cb6" containerID="e2f0a93135860cc5685001a8f0bf500cabed5b120536d85c206191eb0f58cc00" exitCode=0
Apr 16 10:05:14.151348 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:14.150994 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpw2w" event={"ID":"dc3554db-b5f4-43be-8180-e48573e27cb6","Type":"ContainerDied","Data":"e2f0a93135860cc5685001a8f0bf500cabed5b120536d85c206191eb0f58cc00"}
Apr 16 10:05:14.641337 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:14.641298 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:05:14.641525 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:14.641391 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert\") pod \"ingress-canary-xnnjq\" (UID: \"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06\") " pod="openshift-ingress-canary/ingress-canary-xnnjq"
Apr 16 10:05:14.641525 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:14.641459 2525 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 10:05:14.641525 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:14.641500 2525 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 10:05:14.641664 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:14.641550 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls podName:86d6efca-ec4a-40d2-a200-6d8dacba5368 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:18.641531434 +0000 UTC m=+41.240979371 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls") pod "dns-default-sh9pl" (UID: "86d6efca-ec4a-40d2-a200-6d8dacba5368") : secret "dns-default-metrics-tls" not found
Apr 16 10:05:14.641664 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:14.641566 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert podName:4a03f4b8-27ee-4dd2-8150-2b73a73e4f06 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:18.64156048 +0000 UTC m=+41.241008415 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert") pod "ingress-canary-xnnjq" (UID: "4a03f4b8-27ee-4dd2-8150-2b73a73e4f06") : secret "canary-serving-cert" not found
Apr 16 10:05:15.156012 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:15.155975 2525 generic.go:358] "Generic (PLEG): container finished" podID="dc3554db-b5f4-43be-8180-e48573e27cb6" containerID="bf9c3c75986a651a4c0f8953e3bf501715a138bf3d02474564dddfa1b5058afa" exitCode=0
Apr 16 10:05:15.156425 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:15.156046 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpw2w" event={"ID":"dc3554db-b5f4-43be-8180-e48573e27cb6","Type":"ContainerDied","Data":"bf9c3c75986a651a4c0f8953e3bf501715a138bf3d02474564dddfa1b5058afa"}
Apr 16 10:05:16.161671 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:16.161343 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpw2w" event={"ID":"dc3554db-b5f4-43be-8180-e48573e27cb6","Type":"ContainerStarted","Data":"faa04aacbd2359c96f4d4718c605645dd25d8765a312aa6bfe024a086dfb4acb"}
Apr 16 10:05:16.162683 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:16.162661 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nz7jd" event={"ID":"19d87478-ac42-469f-a6d5-8ca391f485f6","Type":"ContainerStarted","Data":"6232762e2f56592c606a55f9fe04efe0e118d37825bb06e091fc8180bd305112"}
Apr 16 10:05:16.162787 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:16.162779 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-nz7jd"
Apr 16 10:05:16.183257 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:16.182282 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fpw2w" podStartSLOduration=5.738590803 podStartE2EDuration="38.182264248s" podCreationTimestamp="2026-04-16 10:04:38 +0000 UTC" firstStartedPulling="2026-04-16 10:04:40.459074018 +0000 UTC m=+3.058521965" lastFinishedPulling="2026-04-16 10:05:12.902747473 +0000 UTC m=+35.502195410" observedRunningTime="2026-04-16 10:05:16.181899672 +0000 UTC m=+38.781347620" watchObservedRunningTime="2026-04-16 10:05:16.182264248 +0000 UTC m=+38.781712204"
Apr 16 10:05:16.195847 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:16.195793 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-nz7jd" podStartSLOduration=35.033920116 podStartE2EDuration="38.195775368s" podCreationTimestamp="2026-04-16 10:04:38 +0000 UTC" firstStartedPulling="2026-04-16 10:05:12.701994778 +0000 UTC m=+35.301442712" lastFinishedPulling="2026-04-16 10:05:15.863850027 +0000 UTC m=+38.463297964" observedRunningTime="2026-04-16 10:05:16.195209629 +0000 UTC m=+38.794657586" watchObservedRunningTime="2026-04-16 10:05:16.195775368 +0000 UTC m=+38.795223327"
Apr 16 10:05:18.670705 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:18.670650 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert\") pod \"ingress-canary-xnnjq\" (UID: \"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06\") " pod="openshift-ingress-canary/ingress-canary-xnnjq"
Apr 16 10:05:18.670705 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:18.670703 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:05:18.671239 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:18.670814 2525 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 10:05:18.671239 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:18.670823 2525 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 10:05:18.671239 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:18.670865 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls podName:86d6efca-ec4a-40d2-a200-6d8dacba5368 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:26.670852896 +0000 UTC m=+49.270300829 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls") pod "dns-default-sh9pl" (UID: "86d6efca-ec4a-40d2-a200-6d8dacba5368") : secret "dns-default-metrics-tls" not found
Apr 16 10:05:18.671239 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:18.670897 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert podName:4a03f4b8-27ee-4dd2-8150-2b73a73e4f06 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:26.670883556 +0000 UTC m=+49.270331493 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert") pod "ingress-canary-xnnjq" (UID: "4a03f4b8-27ee-4dd2-8150-2b73a73e4f06") : secret "canary-serving-cert" not found
Apr 16 10:05:26.099453 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.099413 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-lkz6h"]
Apr 16 10:05:26.117763 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.117733 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lkz6h"]
Apr 16 10:05:26.117912 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.117853 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lkz6h"
Apr 16 10:05:26.120775 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.120746 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 10:05:26.222907 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.222863 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38ea256d-b16d-4ed9-9a51-5a9007157783-original-pull-secret\") pod \"global-pull-secret-syncer-lkz6h\" (UID: \"38ea256d-b16d-4ed9-9a51-5a9007157783\") " pod="kube-system/global-pull-secret-syncer-lkz6h"
Apr 16 10:05:26.223102 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.222962 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/38ea256d-b16d-4ed9-9a51-5a9007157783-kubelet-config\") pod \"global-pull-secret-syncer-lkz6h\" (UID: \"38ea256d-b16d-4ed9-9a51-5a9007157783\") " pod="kube-system/global-pull-secret-syncer-lkz6h"
Apr 16 10:05:26.223102 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.223042 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/38ea256d-b16d-4ed9-9a51-5a9007157783-dbus\") pod \"global-pull-secret-syncer-lkz6h\" (UID: \"38ea256d-b16d-4ed9-9a51-5a9007157783\") " pod="kube-system/global-pull-secret-syncer-lkz6h"
Apr 16 10:05:26.323462 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.323426 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38ea256d-b16d-4ed9-9a51-5a9007157783-original-pull-secret\") pod \"global-pull-secret-syncer-lkz6h\" (UID: \"38ea256d-b16d-4ed9-9a51-5a9007157783\") " pod="kube-system/global-pull-secret-syncer-lkz6h"
Apr 16 10:05:26.323615 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.323481 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/38ea256d-b16d-4ed9-9a51-5a9007157783-kubelet-config\") pod \"global-pull-secret-syncer-lkz6h\" (UID: \"38ea256d-b16d-4ed9-9a51-5a9007157783\") " pod="kube-system/global-pull-secret-syncer-lkz6h"
Apr 16 10:05:26.323615 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.323503 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/38ea256d-b16d-4ed9-9a51-5a9007157783-dbus\") pod \"global-pull-secret-syncer-lkz6h\" (UID: \"38ea256d-b16d-4ed9-9a51-5a9007157783\") " pod="kube-system/global-pull-secret-syncer-lkz6h"
Apr 16 10:05:26.323615 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.323590 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/38ea256d-b16d-4ed9-9a51-5a9007157783-kubelet-config\") pod \"global-pull-secret-syncer-lkz6h\" (UID: \"38ea256d-b16d-4ed9-9a51-5a9007157783\") " pod="kube-system/global-pull-secret-syncer-lkz6h"
Apr 16 10:05:26.323731 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.323694 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/38ea256d-b16d-4ed9-9a51-5a9007157783-dbus\") pod \"global-pull-secret-syncer-lkz6h\" (UID: \"38ea256d-b16d-4ed9-9a51-5a9007157783\") " pod="kube-system/global-pull-secret-syncer-lkz6h"
Apr 16 10:05:26.364530 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.364422 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/38ea256d-b16d-4ed9-9a51-5a9007157783-original-pull-secret\") pod \"global-pull-secret-syncer-lkz6h\" (UID: \"38ea256d-b16d-4ed9-9a51-5a9007157783\") " pod="kube-system/global-pull-secret-syncer-lkz6h"
Apr 16 10:05:26.426371 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.426316 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lkz6h"
Apr 16 10:05:26.560301 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.560270 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lkz6h"]
Apr 16 10:05:26.563756 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:05:26.563725 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38ea256d_b16d_4ed9_9a51_5a9007157783.slice/crio-d00e2394c5466f2ee5c1f35566d00f266744773cfc1d449c2dbc86579f2f495d WatchSource:0}: Error finding container d00e2394c5466f2ee5c1f35566d00f266744773cfc1d449c2dbc86579f2f495d: Status 404 returned error can't find the container with id d00e2394c5466f2ee5c1f35566d00f266744773cfc1d449c2dbc86579f2f495d
Apr 16 10:05:26.727163 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.727080 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert\") pod \"ingress-canary-xnnjq\" (UID: \"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06\") " pod="openshift-ingress-canary/ingress-canary-xnnjq"
Apr 16 10:05:26.727163 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:26.727132 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:05:26.727349 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:26.727237 2525 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 10:05:26.727349 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:26.727248 2525 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 10:05:26.727349 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:26.727289 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls podName:86d6efca-ec4a-40d2-a200-6d8dacba5368 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:42.727275636 +0000 UTC m=+65.326723570 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls") pod "dns-default-sh9pl" (UID: "86d6efca-ec4a-40d2-a200-6d8dacba5368") : secret "dns-default-metrics-tls" not found
Apr 16 10:05:26.727349 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:26.727316 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert podName:4a03f4b8-27ee-4dd2-8150-2b73a73e4f06 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:42.727298267 +0000 UTC m=+65.326746204 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert") pod "ingress-canary-xnnjq" (UID: "4a03f4b8-27ee-4dd2-8150-2b73a73e4f06") : secret "canary-serving-cert" not found
Apr 16 10:05:27.182692 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:27.182654 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lkz6h" event={"ID":"38ea256d-b16d-4ed9-9a51-5a9007157783","Type":"ContainerStarted","Data":"d00e2394c5466f2ee5c1f35566d00f266744773cfc1d449c2dbc86579f2f495d"}
Apr 16 10:05:32.193863 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:32.193826 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lkz6h" event={"ID":"38ea256d-b16d-4ed9-9a51-5a9007157783","Type":"ContainerStarted","Data":"b885afccfd9d04c5ed9d42753466932ec59c7869eb494b6135bb084ca990b698"}
Apr 16 10:05:32.208430 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:32.208381 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-lkz6h" podStartSLOduration=1.536434444 podStartE2EDuration="6.208367395s" podCreationTimestamp="2026-04-16 10:05:26 +0000 UTC" firstStartedPulling="2026-04-16 10:05:26.565505516 +0000 UTC m=+49.164953451" lastFinishedPulling="2026-04-16 10:05:31.237438465 +0000 UTC m=+53.836886402" observedRunningTime="2026-04-16 10:05:32.207935118 +0000 UTC m=+54.807383076" watchObservedRunningTime="2026-04-16 10:05:32.208367395 +0000 UTC m=+54.807815351"
Apr 16 10:05:37.142013 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:37.141985 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rk78v"
Apr 16 10:05:42.746948 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:42.746910 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert\") pod \"ingress-canary-xnnjq\" (UID: \"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06\") " pod="openshift-ingress-canary/ingress-canary-xnnjq"
Apr 16 10:05:42.746948 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:42.746958 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:05:42.747396 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:42.747045 2525 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 10:05:42.747396 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:42.747047 2525 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 10:05:42.747396 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:42.747095 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls podName:86d6efca-ec4a-40d2-a200-6d8dacba5368 nodeName:}" failed. No retries permitted until 2026-04-16 10:06:14.747081431 +0000 UTC m=+97.346529364 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls") pod "dns-default-sh9pl" (UID: "86d6efca-ec4a-40d2-a200-6d8dacba5368") : secret "dns-default-metrics-tls" not found
Apr 16 10:05:42.747396 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:42.747108 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert podName:4a03f4b8-27ee-4dd2-8150-2b73a73e4f06 nodeName:}" failed. No retries permitted until 2026-04-16 10:06:14.747102289 +0000 UTC m=+97.346550224 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert") pod "ingress-canary-xnnjq" (UID: "4a03f4b8-27ee-4dd2-8150-2b73a73e4f06") : secret "canary-serving-cert" not found
Apr 16 10:05:43.754811 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:43.754779 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs\") pod \"network-metrics-daemon-mczdx\" (UID: \"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66\") " pod="openshift-multus/network-metrics-daemon-mczdx"
Apr 16 10:05:43.755207 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:43.754898 2525 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 10:05:43.755207 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:05:43.754951 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs podName:4739af65-2ca2-4e8f-9b0e-ddeac76a9b66 nodeName:}" failed. No retries permitted until 2026-04-16 10:06:47.754937425 +0000 UTC m=+130.354385360 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs") pod "network-metrics-daemon-mczdx" (UID: "4739af65-2ca2-4e8f-9b0e-ddeac76a9b66") : secret "metrics-daemon-secret" not found
Apr 16 10:05:47.166983 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:05:47.166949 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-nz7jd"
Apr 16 10:06:14.770735 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:14.770684 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert\") pod \"ingress-canary-xnnjq\" (UID: \"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06\") " pod="openshift-ingress-canary/ingress-canary-xnnjq"
Apr 16 10:06:14.771304 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:14.770750 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:06:14.771304 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:14.770834 2525 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 10:06:14.771304 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:14.770859 2525 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 10:06:14.771304 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:14.770907 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert podName:4a03f4b8-27ee-4dd2-8150-2b73a73e4f06 nodeName:}" failed. No retries permitted until 2026-04-16 10:07:18.770890825 +0000 UTC m=+161.370338760 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert") pod "ingress-canary-xnnjq" (UID: "4a03f4b8-27ee-4dd2-8150-2b73a73e4f06") : secret "canary-serving-cert" not found
Apr 16 10:06:14.771304 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:14.770922 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls podName:86d6efca-ec4a-40d2-a200-6d8dacba5368 nodeName:}" failed. No retries permitted until 2026-04-16 10:07:18.770915281 +0000 UTC m=+161.370363215 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls") pod "dns-default-sh9pl" (UID: "86d6efca-ec4a-40d2-a200-6d8dacba5368") : secret "dns-default-metrics-tls" not found
Apr 16 10:06:47.434981 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.434952 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg"]
Apr 16 10:06:47.438021 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.437994 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg"
Apr 16 10:06:47.438463 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.438435 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh"]
Apr 16 10:06:47.440969 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.440954 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh"
Apr 16 10:06:47.442026 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.442002 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 16 10:06:47.442132 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.442117 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-cdv7z\""
Apr 16 10:06:47.442200 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.442137 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 10:06:47.442200 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.442137 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 10:06:47.442200 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.442136 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 10:06:47.442858 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.442844 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 10:06:47.442949 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.442884 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 10:06:47.443008 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.442984 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 10:06:47.443367 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.443349
2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-k27mr\"" Apr 16 10:06:47.443668 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.443652 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 10:06:47.449296 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.449275 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg"] Apr 16 10:06:47.450234 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.450216 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh"] Apr 16 10:06:47.545480 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.545449 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-c2rf2"] Apr 16 10:06:47.550374 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.550349 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5ccd57b7c8-vc29b"] Apr 16 10:06:47.550520 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.550487 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 16 10:06:47.552682 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.552657 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 10:06:47.552808 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.552684 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 10:06:47.552866 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.552820 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-b92kz\"" Apr 16 10:06:47.553002 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.552985 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 10:06:47.553104 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.553096 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.553160 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.553146 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 10:06:47.555262 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.555236 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 10:06:47.555352 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.555245 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-kpfjj\"" Apr 16 10:06:47.555481 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.555462 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 10:06:47.555481 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.555473 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 10:06:47.559569 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.559549 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-c2rf2"] Apr 16 10:06:47.562287 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.562269 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 10:06:47.565012 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.564839 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 10:06:47.571590 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.571571 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5ccd57b7c8-vc29b"] 
Apr 16 10:06:47.589584 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.589563 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsnwv\" (UniqueName: \"kubernetes.io/projected/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-kube-api-access-dsnwv\") pod \"cluster-monitoring-operator-6667474d89-j6qrg\" (UID: \"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg" Apr 16 10:06:47.589684 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.589594 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e96bf9a9-c840-4751-89c6-13968641abc6-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vnkxh\" (UID: \"e96bf9a9-c840-4751-89c6-13968641abc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh" Apr 16 10:06:47.589684 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.589613 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfvgt\" (UniqueName: \"kubernetes.io/projected/e96bf9a9-c840-4751-89c6-13968641abc6-kube-api-access-hfvgt\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vnkxh\" (UID: \"e96bf9a9-c840-4751-89c6-13968641abc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh" Apr 16 10:06:47.589684 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.589638 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-j6qrg\" (UID: \"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg" Apr 16 
10:06:47.589684 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.589653 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6qrg\" (UID: \"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg" Apr 16 10:06:47.589828 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.589749 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e96bf9a9-c840-4751-89c6-13968641abc6-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vnkxh\" (UID: \"e96bf9a9-c840-4751-89c6-13968641abc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh" Apr 16 10:06:47.691043 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.690949 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a082bd3-2d68-4bba-a387-faa89f319dac-trusted-ca\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.691043 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.690989 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-j6qrg\" (UID: \"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg" Apr 16 10:06:47.691043 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691010 2525 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6qrg\" (UID: \"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg" Apr 16 10:06:47.691043 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691028 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8603601a-c2cf-4413-9cb5-1801baafd774-serving-cert\") pod \"console-operator-d87b8d5fc-c2rf2\" (UID: \"8603601a-c2cf-4413-9cb5-1801baafd774\") " pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 16 10:06:47.691363 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691066 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e96bf9a9-c840-4751-89c6-13968641abc6-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vnkxh\" (UID: \"e96bf9a9-c840-4751-89c6-13968641abc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh" Apr 16 10:06:47.691363 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:47.691129 2525 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 10:06:47.691363 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:47.691207 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls podName:c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c nodeName:}" failed. No retries permitted until 2026-04-16 10:06:48.191187058 +0000 UTC m=+130.790634992 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-j6qrg" (UID: "c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c") : secret "cluster-monitoring-operator-tls" not found Apr 16 10:06:47.691363 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691234 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.691363 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691263 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-bound-sa-token\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.691363 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691309 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8603601a-c2cf-4413-9cb5-1801baafd774-trusted-ca\") pod \"console-operator-d87b8d5fc-c2rf2\" (UID: \"8603601a-c2cf-4413-9cb5-1801baafd774\") " pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 16 10:06:47.691363 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691336 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fjpn\" (UniqueName: \"kubernetes.io/projected/8603601a-c2cf-4413-9cb5-1801baafd774-kube-api-access-9fjpn\") pod 
\"console-operator-d87b8d5fc-c2rf2\" (UID: \"8603601a-c2cf-4413-9cb5-1801baafd774\") " pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 16 10:06:47.691746 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691387 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a082bd3-2d68-4bba-a387-faa89f319dac-ca-trust-extracted\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.691746 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691423 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsnwv\" (UniqueName: \"kubernetes.io/projected/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-kube-api-access-dsnwv\") pod \"cluster-monitoring-operator-6667474d89-j6qrg\" (UID: \"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg" Apr 16 10:06:47.691746 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691452 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a082bd3-2d68-4bba-a387-faa89f319dac-installation-pull-secrets\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.691746 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691483 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8603601a-c2cf-4413-9cb5-1801baafd774-config\") pod \"console-operator-d87b8d5fc-c2rf2\" (UID: \"8603601a-c2cf-4413-9cb5-1801baafd774\") " pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 16 10:06:47.691746 
ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691538 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e96bf9a9-c840-4751-89c6-13968641abc6-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vnkxh\" (UID: \"e96bf9a9-c840-4751-89c6-13968641abc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh" Apr 16 10:06:47.691746 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691596 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5a082bd3-2d68-4bba-a387-faa89f319dac-image-registry-private-configuration\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.691746 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691621 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-certificates\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.691746 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691655 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfvgt\" (UniqueName: \"kubernetes.io/projected/e96bf9a9-c840-4751-89c6-13968641abc6-kube-api-access-hfvgt\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vnkxh\" (UID: \"e96bf9a9-c840-4751-89c6-13968641abc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh" Apr 16 10:06:47.691746 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.691680 
2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w4pl\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-kube-api-access-5w4pl\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.692158 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.692070 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e96bf9a9-c840-4751-89c6-13968641abc6-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vnkxh\" (UID: \"e96bf9a9-c840-4751-89c6-13968641abc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh" Apr 16 10:06:47.692312 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.692295 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-j6qrg\" (UID: \"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg" Apr 16 10:06:47.693270 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.693254 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e96bf9a9-c840-4751-89c6-13968641abc6-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vnkxh\" (UID: \"e96bf9a9-c840-4751-89c6-13968641abc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh" Apr 16 10:06:47.701444 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.701421 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfvgt\" (UniqueName: 
\"kubernetes.io/projected/e96bf9a9-c840-4751-89c6-13968641abc6-kube-api-access-hfvgt\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vnkxh\" (UID: \"e96bf9a9-c840-4751-89c6-13968641abc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh" Apr 16 10:06:47.701595 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.701581 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsnwv\" (UniqueName: \"kubernetes.io/projected/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-kube-api-access-dsnwv\") pod \"cluster-monitoring-operator-6667474d89-j6qrg\" (UID: \"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg" Apr 16 10:06:47.753753 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.753723 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh" Apr 16 10:06:47.791998 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.791968 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5w4pl\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-kube-api-access-5w4pl\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.792156 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.792013 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a082bd3-2d68-4bba-a387-faa89f319dac-trusted-ca\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.792289 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.792268 2525 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8603601a-c2cf-4413-9cb5-1801baafd774-serving-cert\") pod \"console-operator-d87b8d5fc-c2rf2\" (UID: \"8603601a-c2cf-4413-9cb5-1801baafd774\") " pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 16 10:06:47.792354 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.792322 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.792354 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.792346 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-bound-sa-token\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.792458 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.792364 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs\") pod \"network-metrics-daemon-mczdx\" (UID: \"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66\") " pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:06:47.792458 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.792382 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8603601a-c2cf-4413-9cb5-1801baafd774-trusted-ca\") pod \"console-operator-d87b8d5fc-c2rf2\" (UID: \"8603601a-c2cf-4413-9cb5-1801baafd774\") " 
pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 16 10:06:47.792458 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.792408 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fjpn\" (UniqueName: \"kubernetes.io/projected/8603601a-c2cf-4413-9cb5-1801baafd774-kube-api-access-9fjpn\") pod \"console-operator-d87b8d5fc-c2rf2\" (UID: \"8603601a-c2cf-4413-9cb5-1801baafd774\") " pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 16 10:06:47.792458 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:47.792421 2525 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 10:06:47.792458 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:47.792440 2525 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5ccd57b7c8-vc29b: secret "image-registry-tls" not found Apr 16 10:06:47.792722 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:47.792475 2525 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 10:06:47.792722 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:47.792530 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls podName:5a082bd3-2d68-4bba-a387-faa89f319dac nodeName:}" failed. No retries permitted until 2026-04-16 10:06:48.292487705 +0000 UTC m=+130.891935662 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls") pod "image-registry-5ccd57b7c8-vc29b" (UID: "5a082bd3-2d68-4bba-a387-faa89f319dac") : secret "image-registry-tls" not found Apr 16 10:06:47.792722 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:47.792550 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs podName:4739af65-2ca2-4e8f-9b0e-ddeac76a9b66 nodeName:}" failed. No retries permitted until 2026-04-16 10:08:49.792540426 +0000 UTC m=+252.391988366 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs") pod "network-metrics-daemon-mczdx" (UID: "4739af65-2ca2-4e8f-9b0e-ddeac76a9b66") : secret "metrics-daemon-secret" not found Apr 16 10:06:47.792722 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.792578 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a082bd3-2d68-4bba-a387-faa89f319dac-ca-trust-extracted\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.792722 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.792619 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a082bd3-2d68-4bba-a387-faa89f319dac-installation-pull-secrets\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.792722 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.792649 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/8603601a-c2cf-4413-9cb5-1801baafd774-config\") pod \"console-operator-d87b8d5fc-c2rf2\" (UID: \"8603601a-c2cf-4413-9cb5-1801baafd774\") " pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 16 10:06:47.792722 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.792680 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5a082bd3-2d68-4bba-a387-faa89f319dac-image-registry-private-configuration\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.792722 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.792722 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-certificates\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.793112 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.792937 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a082bd3-2d68-4bba-a387-faa89f319dac-ca-trust-extracted\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.793330 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.793285 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a082bd3-2d68-4bba-a387-faa89f319dac-trusted-ca\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 
10:06:47.793330 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.793308 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8603601a-c2cf-4413-9cb5-1801baafd774-config\") pod \"console-operator-d87b8d5fc-c2rf2\" (UID: \"8603601a-c2cf-4413-9cb5-1801baafd774\") " pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 16 10:06:47.793330 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.793327 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8603601a-c2cf-4413-9cb5-1801baafd774-trusted-ca\") pod \"console-operator-d87b8d5fc-c2rf2\" (UID: \"8603601a-c2cf-4413-9cb5-1801baafd774\") " pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 16 10:06:47.793548 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.793396 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-certificates\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.795014 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.794989 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a082bd3-2d68-4bba-a387-faa89f319dac-installation-pull-secrets\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.795124 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.795031 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8603601a-c2cf-4413-9cb5-1801baafd774-serving-cert\") pod \"console-operator-d87b8d5fc-c2rf2\" (UID: 
\"8603601a-c2cf-4413-9cb5-1801baafd774\") " pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 16 10:06:47.795187 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.795157 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5a082bd3-2d68-4bba-a387-faa89f319dac-image-registry-private-configuration\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.802104 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.802079 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-bound-sa-token\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.802216 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.802201 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w4pl\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-kube-api-access-5w4pl\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:47.802277 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.802238 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fjpn\" (UniqueName: \"kubernetes.io/projected/8603601a-c2cf-4413-9cb5-1801baafd774-kube-api-access-9fjpn\") pod \"console-operator-d87b8d5fc-c2rf2\" (UID: \"8603601a-c2cf-4413-9cb5-1801baafd774\") " pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 16 10:06:47.861251 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.861220 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 16 10:06:47.865337 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.865313 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh"] Apr 16 10:06:47.868557 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:06:47.868532 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode96bf9a9_c840_4751_89c6_13968641abc6.slice/crio-af670c48f7664ebaebfb966d002b3513b6b5535cdc9ba5c36cfe45bc6750e586 WatchSource:0}: Error finding container af670c48f7664ebaebfb966d002b3513b6b5535cdc9ba5c36cfe45bc6750e586: Status 404 returned error can't find the container with id af670c48f7664ebaebfb966d002b3513b6b5535cdc9ba5c36cfe45bc6750e586 Apr 16 10:06:47.973776 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:47.973707 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-c2rf2"] Apr 16 10:06:47.977142 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:06:47.977117 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8603601a_c2cf_4413_9cb5_1801baafd774.slice/crio-90a77c4d97d2ff541034a26777024d4009f93f7ff96acece54f4d2968bf71763 WatchSource:0}: Error finding container 90a77c4d97d2ff541034a26777024d4009f93f7ff96acece54f4d2968bf71763: Status 404 returned error can't find the container with id 90a77c4d97d2ff541034a26777024d4009f93f7ff96acece54f4d2968bf71763 Apr 16 10:06:48.195746 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:48.195714 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-6667474d89-j6qrg\" (UID: \"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg" Apr 16 10:06:48.195888 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:48.195813 2525 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 10:06:48.195888 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:48.195859 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls podName:c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c nodeName:}" failed. No retries permitted until 2026-04-16 10:06:49.195846439 +0000 UTC m=+131.795294373 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-j6qrg" (UID: "c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c") : secret "cluster-monitoring-operator-tls" not found Apr 16 10:06:48.296121 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:48.296035 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:48.296245 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:48.296178 2525 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 10:06:48.296245 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:48.296195 2525 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5ccd57b7c8-vc29b: secret 
"image-registry-tls" not found Apr 16 10:06:48.296320 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:48.296248 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls podName:5a082bd3-2d68-4bba-a387-faa89f319dac nodeName:}" failed. No retries permitted until 2026-04-16 10:06:49.296234256 +0000 UTC m=+131.895682190 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls") pod "image-registry-5ccd57b7c8-vc29b" (UID: "5a082bd3-2d68-4bba-a387-faa89f319dac") : secret "image-registry-tls" not found Apr 16 10:06:48.336328 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:48.336290 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" event={"ID":"8603601a-c2cf-4413-9cb5-1801baafd774","Type":"ContainerStarted","Data":"90a77c4d97d2ff541034a26777024d4009f93f7ff96acece54f4d2968bf71763"} Apr 16 10:06:48.337118 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:48.337102 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh" event={"ID":"e96bf9a9-c840-4751-89c6-13968641abc6","Type":"ContainerStarted","Data":"af670c48f7664ebaebfb966d002b3513b6b5535cdc9ba5c36cfe45bc6750e586"} Apr 16 10:06:49.203329 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:49.203287 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6qrg\" (UID: \"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg" Apr 16 10:06:49.203816 ip-10-0-135-1 kubenswrapper[2525]: E0416 
10:06:49.203430 2525 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 10:06:49.203816 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:49.203537 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls podName:c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c nodeName:}" failed. No retries permitted until 2026-04-16 10:06:51.203499455 +0000 UTC m=+133.802947404 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-j6qrg" (UID: "c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c") : secret "cluster-monitoring-operator-tls" not found Apr 16 10:06:49.303967 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:49.303933 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:49.304128 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:49.304094 2525 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 10:06:49.304128 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:49.304109 2525 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5ccd57b7c8-vc29b: secret "image-registry-tls" not found Apr 16 10:06:49.304251 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:49.304167 2525 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls podName:5a082bd3-2d68-4bba-a387-faa89f319dac nodeName:}" failed. No retries permitted until 2026-04-16 10:06:51.304152655 +0000 UTC m=+133.903600596 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls") pod "image-registry-5ccd57b7c8-vc29b" (UID: "5a082bd3-2d68-4bba-a387-faa89f319dac") : secret "image-registry-tls" not found Apr 16 10:06:51.218648 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.218613 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6qrg\" (UID: \"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg" Apr 16 10:06:51.219046 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:51.218736 2525 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 10:06:51.219046 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:51.218788 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls podName:c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c nodeName:}" failed. No retries permitted until 2026-04-16 10:06:55.218774759 +0000 UTC m=+137.818222694 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-j6qrg" (UID: "c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c") : secret "cluster-monitoring-operator-tls" not found Apr 16 10:06:51.319288 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.319250 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:06:51.319451 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:51.319416 2525 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 10:06:51.319451 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:51.319438 2525 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5ccd57b7c8-vc29b: secret "image-registry-tls" not found Apr 16 10:06:51.319581 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:51.319524 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls podName:5a082bd3-2d68-4bba-a387-faa89f319dac nodeName:}" failed. No retries permitted until 2026-04-16 10:06:55.319487571 +0000 UTC m=+137.918935506 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls") pod "image-registry-5ccd57b7c8-vc29b" (UID: "5a082bd3-2d68-4bba-a387-faa89f319dac") : secret "image-registry-tls" not found Apr 16 10:06:51.344074 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.344053 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/0.log" Apr 16 10:06:51.344199 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.344088 2525 generic.go:358] "Generic (PLEG): container finished" podID="8603601a-c2cf-4413-9cb5-1801baafd774" containerID="7a40e4bcb633fadfda1e0e9f5ea6f22153ff16edd048035397fb98ea53a9f2c5" exitCode=255 Apr 16 10:06:51.344199 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.344117 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" event={"ID":"8603601a-c2cf-4413-9cb5-1801baafd774","Type":"ContainerDied","Data":"7a40e4bcb633fadfda1e0e9f5ea6f22153ff16edd048035397fb98ea53a9f2c5"} Apr 16 10:06:51.344407 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.344390 2525 scope.go:117] "RemoveContainer" containerID="7a40e4bcb633fadfda1e0e9f5ea6f22153ff16edd048035397fb98ea53a9f2c5" Apr 16 10:06:51.345526 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.345491 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh" event={"ID":"e96bf9a9-c840-4751-89c6-13968641abc6","Type":"ContainerStarted","Data":"16321ee4d81de3ab50a0358dd308c7ea1c32bc7e53f9ae0f2abe20e4d5b5d1dc"} Apr 16 10:06:51.373081 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.373042 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh" podStartSLOduration=1.912733953 podStartE2EDuration="4.373028905s" podCreationTimestamp="2026-04-16 10:06:47 +0000 UTC" firstStartedPulling="2026-04-16 10:06:47.870453415 +0000 UTC m=+130.469901349" lastFinishedPulling="2026-04-16 10:06:50.330748353 +0000 UTC m=+132.930196301" observedRunningTime="2026-04-16 10:06:51.372798639 +0000 UTC m=+133.972246610" watchObservedRunningTime="2026-04-16 10:06:51.373028905 +0000 UTC m=+133.972476838" Apr 16 10:06:51.517995 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.517919 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-6t6gn"] Apr 16 10:06:51.520817 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.520794 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6t6gn" Apr 16 10:06:51.523378 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.523356 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-mvhcg\"" Apr 16 10:06:51.523378 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.523371 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 10:06:51.523593 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.523381 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 10:06:51.528973 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.528940 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-6t6gn"] Apr 16 10:06:51.622201 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.622170 2525 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6c9z\" (UniqueName: \"kubernetes.io/projected/63ea1f8d-2d65-47e6-84cc-63d854cc64de-kube-api-access-f6c9z\") pod \"migrator-64d4d94569-6t6gn\" (UID: \"63ea1f8d-2d65-47e6-84cc-63d854cc64de\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6t6gn" Apr 16 10:06:51.722775 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.722724 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6c9z\" (UniqueName: \"kubernetes.io/projected/63ea1f8d-2d65-47e6-84cc-63d854cc64de-kube-api-access-f6c9z\") pod \"migrator-64d4d94569-6t6gn\" (UID: \"63ea1f8d-2d65-47e6-84cc-63d854cc64de\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6t6gn" Apr 16 10:06:51.730269 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.730240 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6c9z\" (UniqueName: \"kubernetes.io/projected/63ea1f8d-2d65-47e6-84cc-63d854cc64de-kube-api-access-f6c9z\") pod \"migrator-64d4d94569-6t6gn\" (UID: \"63ea1f8d-2d65-47e6-84cc-63d854cc64de\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6t6gn" Apr 16 10:06:51.830352 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.830313 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6t6gn" Apr 16 10:06:51.942239 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:51.942208 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-6t6gn"] Apr 16 10:06:51.945016 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:06:51.944993 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63ea1f8d_2d65_47e6_84cc_63d854cc64de.slice/crio-8a04d7a958004e991b6977fdd7e8c4f2ee9511e337ca1e0685c65f76eb9c16a9 WatchSource:0}: Error finding container 8a04d7a958004e991b6977fdd7e8c4f2ee9511e337ca1e0685c65f76eb9c16a9: Status 404 returned error can't find the container with id 8a04d7a958004e991b6977fdd7e8c4f2ee9511e337ca1e0685c65f76eb9c16a9 Apr 16 10:06:52.348060 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:52.348024 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6t6gn" event={"ID":"63ea1f8d-2d65-47e6-84cc-63d854cc64de","Type":"ContainerStarted","Data":"8a04d7a958004e991b6977fdd7e8c4f2ee9511e337ca1e0685c65f76eb9c16a9"} Apr 16 10:06:52.349303 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:52.349284 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/1.log" Apr 16 10:06:52.349675 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:52.349659 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/0.log" Apr 16 10:06:52.349717 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:52.349693 2525 generic.go:358] "Generic (PLEG): container finished" podID="8603601a-c2cf-4413-9cb5-1801baafd774" 
containerID="28b9a2c5358608ee0dd9451cb05b9d12bd11eead08e7cddaafc5c841d556d153" exitCode=255 Apr 16 10:06:52.349807 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:52.349786 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" event={"ID":"8603601a-c2cf-4413-9cb5-1801baafd774","Type":"ContainerDied","Data":"28b9a2c5358608ee0dd9451cb05b9d12bd11eead08e7cddaafc5c841d556d153"} Apr 16 10:06:52.349848 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:52.349829 2525 scope.go:117] "RemoveContainer" containerID="7a40e4bcb633fadfda1e0e9f5ea6f22153ff16edd048035397fb98ea53a9f2c5" Apr 16 10:06:52.350036 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:52.350017 2525 scope.go:117] "RemoveContainer" containerID="28b9a2c5358608ee0dd9451cb05b9d12bd11eead08e7cddaafc5c841d556d153" Apr 16 10:06:52.350257 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:52.350240 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-c2rf2_openshift-console-operator(8603601a-c2cf-4413-9cb5-1801baafd774)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" podUID="8603601a-c2cf-4413-9cb5-1801baafd774" Apr 16 10:06:53.353297 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.353261 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/1.log" Apr 16 10:06:53.353765 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.353662 2525 scope.go:117] "RemoveContainer" containerID="28b9a2c5358608ee0dd9451cb05b9d12bd11eead08e7cddaafc5c841d556d153" Apr 16 10:06:53.353856 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:53.353834 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with 
CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-c2rf2_openshift-console-operator(8603601a-c2cf-4413-9cb5-1801baafd774)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" podUID="8603601a-c2cf-4413-9cb5-1801baafd774" Apr 16 10:06:53.354845 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.354825 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6t6gn" event={"ID":"63ea1f8d-2d65-47e6-84cc-63d854cc64de","Type":"ContainerStarted","Data":"6a3c1471cbf89bbb073ef41f30987961b634021694bca4a20b440338b9955f4d"} Apr 16 10:06:53.354845 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.354852 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6t6gn" event={"ID":"63ea1f8d-2d65-47e6-84cc-63d854cc64de","Type":"ContainerStarted","Data":"222c94acb63930662f88b147495354039cc0b1adbeb696244b6edacc420911f6"} Apr 16 10:06:53.391689 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.391647 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6t6gn" podStartSLOduration=1.401134813 podStartE2EDuration="2.391635482s" podCreationTimestamp="2026-04-16 10:06:51 +0000 UTC" firstStartedPulling="2026-04-16 10:06:51.9467852 +0000 UTC m=+134.546233135" lastFinishedPulling="2026-04-16 10:06:52.93728586 +0000 UTC m=+135.536733804" observedRunningTime="2026-04-16 10:06:53.390863729 +0000 UTC m=+135.990311686" watchObservedRunningTime="2026-04-16 10:06:53.391635482 +0000 UTC m=+135.991083437" Apr 16 10:06:53.543233 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.543205 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-2cchd"] Apr 16 10:06:53.546048 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.546034 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-2cchd" Apr 16 10:06:53.548412 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.548388 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-msjcs\"" Apr 16 10:06:53.548553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.548433 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 10:06:53.548553 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.548437 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 10:06:53.548670 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.548658 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 10:06:53.548760 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.548743 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 10:06:53.553152 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.553129 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-2cchd"] Apr 16 10:06:53.639626 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.639552 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6-signing-key\") pod \"service-ca-bfc587fb7-2cchd\" (UID: \"31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6\") " pod="openshift-service-ca/service-ca-bfc587fb7-2cchd" Apr 16 10:06:53.639747 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.639660 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6-signing-cabundle\") pod \"service-ca-bfc587fb7-2cchd\" (UID: \"31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6\") " pod="openshift-service-ca/service-ca-bfc587fb7-2cchd" Apr 16 10:06:53.639747 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.639724 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqpdl\" (UniqueName: \"kubernetes.io/projected/31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6-kube-api-access-jqpdl\") pod \"service-ca-bfc587fb7-2cchd\" (UID: \"31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6\") " pod="openshift-service-ca/service-ca-bfc587fb7-2cchd" Apr 16 10:06:53.740960 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.740911 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6-signing-cabundle\") pod \"service-ca-bfc587fb7-2cchd\" (UID: \"31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6\") " pod="openshift-service-ca/service-ca-bfc587fb7-2cchd" Apr 16 10:06:53.741061 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.740987 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqpdl\" (UniqueName: \"kubernetes.io/projected/31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6-kube-api-access-jqpdl\") pod \"service-ca-bfc587fb7-2cchd\" (UID: \"31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6\") " pod="openshift-service-ca/service-ca-bfc587fb7-2cchd" Apr 16 10:06:53.741061 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.741015 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6-signing-key\") pod \"service-ca-bfc587fb7-2cchd\" (UID: \"31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6\") " pod="openshift-service-ca/service-ca-bfc587fb7-2cchd" Apr 16 10:06:53.741594 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.741565 
2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6-signing-cabundle\") pod \"service-ca-bfc587fb7-2cchd\" (UID: \"31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6\") " pod="openshift-service-ca/service-ca-bfc587fb7-2cchd" Apr 16 10:06:53.743315 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.743293 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6-signing-key\") pod \"service-ca-bfc587fb7-2cchd\" (UID: \"31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6\") " pod="openshift-service-ca/service-ca-bfc587fb7-2cchd" Apr 16 10:06:53.748983 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.748956 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqpdl\" (UniqueName: \"kubernetes.io/projected/31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6-kube-api-access-jqpdl\") pod \"service-ca-bfc587fb7-2cchd\" (UID: \"31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6\") " pod="openshift-service-ca/service-ca-bfc587fb7-2cchd" Apr 16 10:06:53.759861 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.759843 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-scbdk_9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4/dns-node-resolver/0.log" Apr 16 10:06:53.855121 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.855094 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-2cchd" Apr 16 10:06:53.963837 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:53.963808 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-2cchd"] Apr 16 10:06:53.966588 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:06:53.966564 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31b05e1f_02fc_4a2d_8bdb_a96c915c7fe6.slice/crio-1610c9ef94ad5a60048820ef53b08908816561248c1fc30b9b4d4e20fa7ff185 WatchSource:0}: Error finding container 1610c9ef94ad5a60048820ef53b08908816561248c1fc30b9b4d4e20fa7ff185: Status 404 returned error can't find the container with id 1610c9ef94ad5a60048820ef53b08908816561248c1fc30b9b4d4e20fa7ff185 Apr 16 10:06:54.358765 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:54.358732 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-2cchd" event={"ID":"31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6","Type":"ContainerStarted","Data":"1610c9ef94ad5a60048820ef53b08908816561248c1fc30b9b4d4e20fa7ff185"} Apr 16 10:06:54.361177 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:54.361156 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hhqrb_3912faf1-da9e-41de-8d4c-b2cdb354f252/node-ca/0.log" Apr 16 10:06:55.253995 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:55.253957 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6qrg\" (UID: \"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg" Apr 16 10:06:55.254155 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:55.254136 2525 secret.go:189] Couldn't get 
secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 10:06:55.254236 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:55.254224 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls podName:c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c nodeName:}" failed. No retries permitted until 2026-04-16 10:07:03.25420233 +0000 UTC m=+145.853650264 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-j6qrg" (UID: "c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c") : secret "cluster-monitoring-operator-tls" not found
Apr 16 10:06:55.354391 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:55.354351 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b"
Apr 16 10:06:55.354576 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:55.354550 2525 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 10:06:55.354576 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:55.354571 2525 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5ccd57b7c8-vc29b: secret "image-registry-tls" not found
Apr 16 10:06:55.354689 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:55.354654 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls podName:5a082bd3-2d68-4bba-a387-faa89f319dac nodeName:}" failed. No retries permitted until 2026-04-16 10:07:03.354633555 +0000 UTC m=+145.954081541 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls") pod "image-registry-5ccd57b7c8-vc29b" (UID: "5a082bd3-2d68-4bba-a387-faa89f319dac") : secret "image-registry-tls" not found
Apr 16 10:06:56.368111 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:56.368073 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-2cchd" event={"ID":"31b05e1f-02fc-4a2d-8bdb-a96c915c7fe6","Type":"ContainerStarted","Data":"0ff1333598139f008171beb99195bc9d586e04c7003e2f9d918b189fc962db0f"}
Apr 16 10:06:57.861501 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:57.861453 2525 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2"
Apr 16 10:06:57.861501 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:57.861499 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2"
Apr 16 10:06:57.861905 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:06:57.861835 2525 scope.go:117] "RemoveContainer" containerID="28b9a2c5358608ee0dd9451cb05b9d12bd11eead08e7cddaafc5c841d556d153"
Apr 16 10:06:57.861994 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:06:57.861978 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-c2rf2_openshift-console-operator(8603601a-c2cf-4413-9cb5-1801baafd774)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" podUID="8603601a-c2cf-4413-9cb5-1801baafd774"
Apr 16 10:07:03.316832 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:03.316799 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6qrg\" (UID: \"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg"
Apr 16 10:07:03.317182 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:07:03.316909 2525 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 10:07:03.317182 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:07:03.316961 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls podName:c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c nodeName:}" failed. No retries permitted until 2026-04-16 10:07:19.316946009 +0000 UTC m=+161.916393943 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-j6qrg" (UID: "c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c") : secret "cluster-monitoring-operator-tls" not found
Apr 16 10:07:03.418096 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:03.418066 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b"
Apr 16 10:07:03.420252 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:03.420225 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls\") pod \"image-registry-5ccd57b7c8-vc29b\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b"
Apr 16 10:07:03.467995 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:03.467966 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b"
Apr 16 10:07:03.597961 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:03.597867 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-2cchd" podStartSLOduration=8.971961938 podStartE2EDuration="10.597849747s" podCreationTimestamp="2026-04-16 10:06:53 +0000 UTC" firstStartedPulling="2026-04-16 10:06:53.968316094 +0000 UTC m=+136.567764027" lastFinishedPulling="2026-04-16 10:06:55.594203897 +0000 UTC m=+138.193651836" observedRunningTime="2026-04-16 10:06:56.385502543 +0000 UTC m=+138.984950498" watchObservedRunningTime="2026-04-16 10:07:03.597849747 +0000 UTC m=+146.197297702"
Apr 16 10:07:03.598124 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:03.598086 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5ccd57b7c8-vc29b"]
Apr 16 10:07:03.601568 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:07:03.601537 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a082bd3_2d68_4bba_a387_faa89f319dac.slice/crio-45d7dbe60348ba1237d1c147dcd0b4620224470606647cf4fa7fcc00e9bf9991 WatchSource:0}: Error finding container 45d7dbe60348ba1237d1c147dcd0b4620224470606647cf4fa7fcc00e9bf9991: Status 404 returned error can't find the container with id 45d7dbe60348ba1237d1c147dcd0b4620224470606647cf4fa7fcc00e9bf9991
Apr 16 10:07:04.388371 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:04.388341 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" event={"ID":"5a082bd3-2d68-4bba-a387-faa89f319dac","Type":"ContainerStarted","Data":"6a50f8ca0c1a1e17e32c9bdfa14b81dc5b8c4672ef1d0a05aa88b888c4934786"}
Apr 16 10:07:04.388371 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:04.388375 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" event={"ID":"5a082bd3-2d68-4bba-a387-faa89f319dac","Type":"ContainerStarted","Data":"45d7dbe60348ba1237d1c147dcd0b4620224470606647cf4fa7fcc00e9bf9991"}
Apr 16 10:07:04.388801 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:04.388477 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b"
Apr 16 10:07:04.433235 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:04.433187 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" podStartSLOduration=17.433171301 podStartE2EDuration="17.433171301s" podCreationTimestamp="2026-04-16 10:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:07:04.431932139 +0000 UTC m=+147.031380093" watchObservedRunningTime="2026-04-16 10:07:04.433171301 +0000 UTC m=+147.032619257"
Apr 16 10:07:09.988235 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:09.988204 2525 scope.go:117] "RemoveContainer" containerID="28b9a2c5358608ee0dd9451cb05b9d12bd11eead08e7cddaafc5c841d556d153"
Apr 16 10:07:10.403265 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:10.403240 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:07:10.403617 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:10.403602 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/1.log"
Apr 16 10:07:10.403688 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:10.403633 2525 generic.go:358] "Generic (PLEG): container finished" podID="8603601a-c2cf-4413-9cb5-1801baafd774" containerID="160c8c30866dd4393f0fb1643e7e77e5b04f7534fa67b4416575507ba3571632" exitCode=255
Apr 16 10:07:10.403688 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:10.403671 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" event={"ID":"8603601a-c2cf-4413-9cb5-1801baafd774","Type":"ContainerDied","Data":"160c8c30866dd4393f0fb1643e7e77e5b04f7534fa67b4416575507ba3571632"}
Apr 16 10:07:10.403796 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:10.403697 2525 scope.go:117] "RemoveContainer" containerID="28b9a2c5358608ee0dd9451cb05b9d12bd11eead08e7cddaafc5c841d556d153"
Apr 16 10:07:10.404032 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:10.404015 2525 scope.go:117] "RemoveContainer" containerID="160c8c30866dd4393f0fb1643e7e77e5b04f7534fa67b4416575507ba3571632"
Apr 16 10:07:10.404227 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:07:10.404208 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-c2rf2_openshift-console-operator(8603601a-c2cf-4413-9cb5-1801baafd774)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" podUID="8603601a-c2cf-4413-9cb5-1801baafd774"
Apr 16 10:07:11.407210 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:11.407182 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:07:13.864075 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:07:13.864029 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-sh9pl" podUID="86d6efca-ec4a-40d2-a200-6d8dacba5368"
Apr 16 10:07:13.878214 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:07:13.878178 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-xnnjq" podUID="4a03f4b8-27ee-4dd2-8150-2b73a73e4f06"
Apr 16 10:07:13.886535 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.886495 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2b85g"]
Apr 16 10:07:13.891278 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.891263 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:13.894000 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.893976 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 10:07:13.895111 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.895092 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 10:07:13.895196 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.895092 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 10:07:13.895196 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.895160 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vnpkd\""
Apr 16 10:07:13.896120 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.896101 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 10:07:13.898453 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.898433 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1888e23b-9237-4f13-9a55-ad7aa5434c5d-data-volume\") pod \"insights-runtime-extractor-2b85g\" (UID: \"1888e23b-9237-4f13-9a55-ad7aa5434c5d\") " pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:13.898551 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.898523 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1888e23b-9237-4f13-9a55-ad7aa5434c5d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2b85g\" (UID: \"1888e23b-9237-4f13-9a55-ad7aa5434c5d\") " pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:13.898623 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.898562 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1888e23b-9237-4f13-9a55-ad7aa5434c5d-crio-socket\") pod \"insights-runtime-extractor-2b85g\" (UID: \"1888e23b-9237-4f13-9a55-ad7aa5434c5d\") " pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:13.898623 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.898597 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1888e23b-9237-4f13-9a55-ad7aa5434c5d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2b85g\" (UID: \"1888e23b-9237-4f13-9a55-ad7aa5434c5d\") " pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:13.898736 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.898642 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-226z8\" (UniqueName: \"kubernetes.io/projected/1888e23b-9237-4f13-9a55-ad7aa5434c5d-kube-api-access-226z8\") pod \"insights-runtime-extractor-2b85g\" (UID: \"1888e23b-9237-4f13-9a55-ad7aa5434c5d\") " pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:13.903454 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.903435 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2b85g"]
Apr 16 10:07:13.999310 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.999278 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-226z8\" (UniqueName: \"kubernetes.io/projected/1888e23b-9237-4f13-9a55-ad7aa5434c5d-kube-api-access-226z8\") pod \"insights-runtime-extractor-2b85g\" (UID: \"1888e23b-9237-4f13-9a55-ad7aa5434c5d\") " pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:13.999310 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.999311 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1888e23b-9237-4f13-9a55-ad7aa5434c5d-data-volume\") pod \"insights-runtime-extractor-2b85g\" (UID: \"1888e23b-9237-4f13-9a55-ad7aa5434c5d\") " pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:13.999562 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.999366 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1888e23b-9237-4f13-9a55-ad7aa5434c5d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2b85g\" (UID: \"1888e23b-9237-4f13-9a55-ad7aa5434c5d\") " pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:13.999562 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.999385 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1888e23b-9237-4f13-9a55-ad7aa5434c5d-crio-socket\") pod \"insights-runtime-extractor-2b85g\" (UID: \"1888e23b-9237-4f13-9a55-ad7aa5434c5d\") " pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:13.999562 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.999416 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1888e23b-9237-4f13-9a55-ad7aa5434c5d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2b85g\" (UID: \"1888e23b-9237-4f13-9a55-ad7aa5434c5d\") " pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:13.999562 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.999482 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1888e23b-9237-4f13-9a55-ad7aa5434c5d-crio-socket\") pod \"insights-runtime-extractor-2b85g\" (UID: \"1888e23b-9237-4f13-9a55-ad7aa5434c5d\") " pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:13.999774 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.999718 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1888e23b-9237-4f13-9a55-ad7aa5434c5d-data-volume\") pod \"insights-runtime-extractor-2b85g\" (UID: \"1888e23b-9237-4f13-9a55-ad7aa5434c5d\") " pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:13.999915 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:13.999895 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1888e23b-9237-4f13-9a55-ad7aa5434c5d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2b85g\" (UID: \"1888e23b-9237-4f13-9a55-ad7aa5434c5d\") " pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:14.001766 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:14.001742 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1888e23b-9237-4f13-9a55-ad7aa5434c5d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2b85g\" (UID: \"1888e23b-9237-4f13-9a55-ad7aa5434c5d\") " pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:14.004677 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:07:14.004650 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-mczdx" podUID="4739af65-2ca2-4e8f-9b0e-ddeac76a9b66"
Apr 16 10:07:14.010554 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:14.010532 2525 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5ccd57b7c8-vc29b"]
Apr 16 10:07:14.019066 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:14.019043 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-226z8\" (UniqueName: \"kubernetes.io/projected/1888e23b-9237-4f13-9a55-ad7aa5434c5d-kube-api-access-226z8\") pod \"insights-runtime-extractor-2b85g\" (UID: \"1888e23b-9237-4f13-9a55-ad7aa5434c5d\") " pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:14.200007 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:14.199911 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2b85g"
Apr 16 10:07:14.314551 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:14.314504 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2b85g"]
Apr 16 10:07:14.317661 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:07:14.317633 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1888e23b_9237_4f13_9a55_ad7aa5434c5d.slice/crio-2d1c7516cfc2b333f973bf9b7628c1dbae4c1160d9de85c1b1d73b6fc136712b WatchSource:0}: Error finding container 2d1c7516cfc2b333f973bf9b7628c1dbae4c1160d9de85c1b1d73b6fc136712b: Status 404 returned error can't find the container with id 2d1c7516cfc2b333f973bf9b7628c1dbae4c1160d9de85c1b1d73b6fc136712b
Apr 16 10:07:14.416619 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:14.416586 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2b85g" event={"ID":"1888e23b-9237-4f13-9a55-ad7aa5434c5d","Type":"ContainerStarted","Data":"3297be21e77274d1cc5ad61571a8cccf58b3096049ccc121c3fc520c58b878c7"}
Apr 16 10:07:14.416797 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:14.416631 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2b85g" event={"ID":"1888e23b-9237-4f13-9a55-ad7aa5434c5d","Type":"ContainerStarted","Data":"2d1c7516cfc2b333f973bf9b7628c1dbae4c1160d9de85c1b1d73b6fc136712b"}
Apr 16 10:07:14.416797 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:14.416601 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:07:15.420342 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:15.420312 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2b85g" event={"ID":"1888e23b-9237-4f13-9a55-ad7aa5434c5d","Type":"ContainerStarted","Data":"53c822935223624dff674db89fd36d3dee3853728eeafe7bae8f277531052363"}
Apr 16 10:07:17.427300 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:17.427264 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2b85g" event={"ID":"1888e23b-9237-4f13-9a55-ad7aa5434c5d","Type":"ContainerStarted","Data":"3d45a3c8a04e4aa635c11546ceabccfad407067f506fee5ae780eb43b52bb16b"}
Apr 16 10:07:17.442718 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:17.442666 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2b85g" podStartSLOduration=2.148597158 podStartE2EDuration="4.442652173s" podCreationTimestamp="2026-04-16 10:07:13 +0000 UTC" firstStartedPulling="2026-04-16 10:07:14.373699106 +0000 UTC m=+156.973147054" lastFinishedPulling="2026-04-16 10:07:16.667753919 +0000 UTC m=+159.267202069" observedRunningTime="2026-04-16 10:07:17.442139237 +0000 UTC m=+160.041587192" watchObservedRunningTime="2026-04-16 10:07:17.442652173 +0000 UTC m=+160.042100142"
Apr 16 10:07:17.862328 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:17.862295 2525 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2"
Apr 16 10:07:17.862328 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:17.862328 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2"
Apr 16 10:07:17.862722 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:17.862698 2525 scope.go:117] "RemoveContainer" containerID="160c8c30866dd4393f0fb1643e7e77e5b04f7534fa67b4416575507ba3571632"
Apr 16 10:07:17.862911 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:07:17.862892 2525 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-c2rf2_openshift-console-operator(8603601a-c2cf-4413-9cb5-1801baafd774)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" podUID="8603601a-c2cf-4413-9cb5-1801baafd774"
Apr 16 10:07:18.841031 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:18.840986 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:07:18.841031 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:18.841050 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert\") pod \"ingress-canary-xnnjq\" (UID: \"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06\") " pod="openshift-ingress-canary/ingress-canary-xnnjq"
Apr 16 10:07:18.843319 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:18.843292 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86d6efca-ec4a-40d2-a200-6d8dacba5368-metrics-tls\") pod \"dns-default-sh9pl\" (UID: \"86d6efca-ec4a-40d2-a200-6d8dacba5368\") " pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:07:18.843440 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:18.843421 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a03f4b8-27ee-4dd2-8150-2b73a73e4f06-cert\") pod \"ingress-canary-xnnjq\" (UID: \"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06\") " pod="openshift-ingress-canary/ingress-canary-xnnjq"
Apr 16 10:07:18.919654 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:18.919623 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ccsn6\""
Apr 16 10:07:18.927733 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:18.927713 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:07:19.038792 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:19.038764 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sh9pl"]
Apr 16 10:07:19.042411 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:07:19.042376 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86d6efca_ec4a_40d2_a200_6d8dacba5368.slice/crio-aa89ca7ff9c65c501a7167804fc8378a9a60da244bf9d7e24b15a79a77a821fa WatchSource:0}: Error finding container aa89ca7ff9c65c501a7167804fc8378a9a60da244bf9d7e24b15a79a77a821fa: Status 404 returned error can't find the container with id aa89ca7ff9c65c501a7167804fc8378a9a60da244bf9d7e24b15a79a77a821fa
Apr 16 10:07:19.343970 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:19.343939 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6qrg\" (UID: \"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg"
Apr 16 10:07:19.346243 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:19.346215 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-j6qrg\" (UID: \"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg"
Apr 16 10:07:19.433363 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:19.433324 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sh9pl" event={"ID":"86d6efca-ec4a-40d2-a200-6d8dacba5368","Type":"ContainerStarted","Data":"aa89ca7ff9c65c501a7167804fc8378a9a60da244bf9d7e24b15a79a77a821fa"}
Apr 16 10:07:19.548278 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:19.548234 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg"
Apr 16 10:07:19.677943 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:19.677910 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg"]
Apr 16 10:07:19.681420 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:07:19.681394 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc26fcd6b_d7f0_4fc8_809f_192c8b9eb35c.slice/crio-7552d5e0fc7372e29a7063d111650ef35820204144da6b2eb782bff7cf84f677 WatchSource:0}: Error finding container 7552d5e0fc7372e29a7063d111650ef35820204144da6b2eb782bff7cf84f677: Status 404 returned error can't find the container with id 7552d5e0fc7372e29a7063d111650ef35820204144da6b2eb782bff7cf84f677
Apr 16 10:07:20.438180 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:20.438128 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg" event={"ID":"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c","Type":"ContainerStarted","Data":"7552d5e0fc7372e29a7063d111650ef35820204144da6b2eb782bff7cf84f677"}
Apr 16 10:07:20.439947 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:20.439899 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sh9pl" event={"ID":"86d6efca-ec4a-40d2-a200-6d8dacba5368","Type":"ContainerStarted","Data":"a99a0c150f2d46469c29e22e5364bc179f4db86583da20dbaea1b684195b7af7"}
Apr 16 10:07:21.444326 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:21.444290 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sh9pl" event={"ID":"86d6efca-ec4a-40d2-a200-6d8dacba5368","Type":"ContainerStarted","Data":"7895daffebc33e78378da4548ae5630905ed0325b3bbe1ade66e8ce9c5c72859"}
Apr 16 10:07:21.444825 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:21.444458 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-sh9pl"
Apr 16 10:07:21.460671 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:21.460628 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sh9pl" podStartSLOduration=130.243004839 podStartE2EDuration="2m11.460613253s" podCreationTimestamp="2026-04-16 10:05:10 +0000 UTC" firstStartedPulling="2026-04-16 10:07:19.044150225 +0000 UTC m=+161.643598158" lastFinishedPulling="2026-04-16 10:07:20.261758635 +0000 UTC m=+162.861206572" observedRunningTime="2026-04-16 10:07:21.459611313 +0000 UTC m=+164.059059282" watchObservedRunningTime="2026-04-16 10:07:21.460613253 +0000 UTC m=+164.060061209"
Apr 16 10:07:22.448318 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:22.448282 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg" event={"ID":"c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c","Type":"ContainerStarted","Data":"331d8f17123eb0b851bd4b6ad5989018109648b0ebf0217a323fc415170a1336"}
Apr 16 10:07:22.465184 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:22.465133 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-j6qrg" podStartSLOduration=33.630360322 podStartE2EDuration="35.465116998s" podCreationTimestamp="2026-04-16 10:06:47 +0000 UTC" firstStartedPulling="2026-04-16 10:07:19.683577617 +0000 UTC m=+162.283025551" lastFinishedPulling="2026-04-16 10:07:21.518334293 +0000 UTC m=+164.117782227" observedRunningTime="2026-04-16 10:07:22.464333402 +0000 UTC m=+165.063781358" watchObservedRunningTime="2026-04-16 10:07:22.465116998 +0000 UTC m=+165.064564954"
Apr 16 10:07:24.015916 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:24.015573 2525 patch_prober.go:28] interesting pod/image-registry-5ccd57b7c8-vc29b container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 10:07:24.015916 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:24.015639 2525 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" podUID="5a082bd3-2d68-4bba-a387-faa89f319dac" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 10:07:24.988864 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:24.988829 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xnnjq"
Apr 16 10:07:24.990923 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:24.990904 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cwtl6\""
Apr 16 10:07:24.999724 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:24.999696 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xnnjq"
Apr 16 10:07:25.064406 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.064376 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-6xll6"]
Apr 16 10:07:25.069913 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.069888 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6"
Apr 16 10:07:25.072237 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.072215 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 10:07:25.072452 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.072264 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 10:07:25.072579 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.072332 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-5vzzd\""
Apr 16 10:07:25.078345 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.073565 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 10:07:25.084088 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.084064 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-6xll6"]
Apr 16 10:07:25.092994 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.092970 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngzzn\" (UniqueName: \"kubernetes.io/projected/e8c5484b-2c69-48a7-aff1-28b3f4709e90-kube-api-access-ngzzn\") pod \"prometheus-operator-78f957474d-6xll6\" (UID: \"e8c5484b-2c69-48a7-aff1-28b3f4709e90\") " pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6"
Apr 16 10:07:25.093106 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.093007 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8c5484b-2c69-48a7-aff1-28b3f4709e90-metrics-client-ca\") pod \"prometheus-operator-78f957474d-6xll6\" (UID: \"e8c5484b-2c69-48a7-aff1-28b3f4709e90\") " pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6"
Apr 16 10:07:25.093106 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.093074 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e8c5484b-2c69-48a7-aff1-28b3f4709e90-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-6xll6\" (UID: \"e8c5484b-2c69-48a7-aff1-28b3f4709e90\") " pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6"
Apr 16 10:07:25.093106 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.093095 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8c5484b-2c69-48a7-aff1-28b3f4709e90-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-6xll6\" (UID: \"e8c5484b-2c69-48a7-aff1-28b3f4709e90\") " pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6"
Apr 16 10:07:25.124187 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.124157 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xnnjq"]
Apr 16 10:07:25.127383 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:07:25.127356 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a03f4b8_27ee_4dd2_8150_2b73a73e4f06.slice/crio-78bb079beb6d531d36181146500e7acf3b89c44dc482190203451f679c05a6fc WatchSource:0}: Error finding container 78bb079beb6d531d36181146500e7acf3b89c44dc482190203451f679c05a6fc: Status 404 returned error can't find the container with id 78bb079beb6d531d36181146500e7acf3b89c44dc482190203451f679c05a6fc
Apr 16 10:07:25.193822 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.193780 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e8c5484b-2c69-48a7-aff1-28b3f4709e90-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-6xll6\" (UID: \"e8c5484b-2c69-48a7-aff1-28b3f4709e90\") " pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6"
Apr 16 10:07:25.193978 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.193833 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8c5484b-2c69-48a7-aff1-28b3f4709e90-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-6xll6\" (UID: \"e8c5484b-2c69-48a7-aff1-28b3f4709e90\") " pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6"
Apr 16 10:07:25.193978 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.193877 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngzzn\" (UniqueName: \"kubernetes.io/projected/e8c5484b-2c69-48a7-aff1-28b3f4709e90-kube-api-access-ngzzn\") pod \"prometheus-operator-78f957474d-6xll6\" (UID: \"e8c5484b-2c69-48a7-aff1-28b3f4709e90\") " pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6"
Apr 16 10:07:25.194089 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:07:25.193972 2525 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not 
found Apr 16 10:07:25.194089 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.194008 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8c5484b-2c69-48a7-aff1-28b3f4709e90-metrics-client-ca\") pod \"prometheus-operator-78f957474d-6xll6\" (UID: \"e8c5484b-2c69-48a7-aff1-28b3f4709e90\") " pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6" Apr 16 10:07:25.194089 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:07:25.194042 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8c5484b-2c69-48a7-aff1-28b3f4709e90-prometheus-operator-tls podName:e8c5484b-2c69-48a7-aff1-28b3f4709e90 nodeName:}" failed. No retries permitted until 2026-04-16 10:07:25.694021563 +0000 UTC m=+168.293469498 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/e8c5484b-2c69-48a7-aff1-28b3f4709e90-prometheus-operator-tls") pod "prometheus-operator-78f957474d-6xll6" (UID: "e8c5484b-2c69-48a7-aff1-28b3f4709e90") : secret "prometheus-operator-tls" not found Apr 16 10:07:25.194562 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.194546 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8c5484b-2c69-48a7-aff1-28b3f4709e90-metrics-client-ca\") pod \"prometheus-operator-78f957474d-6xll6\" (UID: \"e8c5484b-2c69-48a7-aff1-28b3f4709e90\") " pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6" Apr 16 10:07:25.196186 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.196163 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e8c5484b-2c69-48a7-aff1-28b3f4709e90-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-6xll6\" (UID: \"e8c5484b-2c69-48a7-aff1-28b3f4709e90\") " 
pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6" Apr 16 10:07:25.202529 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.202487 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngzzn\" (UniqueName: \"kubernetes.io/projected/e8c5484b-2c69-48a7-aff1-28b3f4709e90-kube-api-access-ngzzn\") pod \"prometheus-operator-78f957474d-6xll6\" (UID: \"e8c5484b-2c69-48a7-aff1-28b3f4709e90\") " pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6" Apr 16 10:07:25.455897 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.455861 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xnnjq" event={"ID":"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06","Type":"ContainerStarted","Data":"78bb079beb6d531d36181146500e7acf3b89c44dc482190203451f679c05a6fc"} Apr 16 10:07:25.698649 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.698611 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8c5484b-2c69-48a7-aff1-28b3f4709e90-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-6xll6\" (UID: \"e8c5484b-2c69-48a7-aff1-28b3f4709e90\") " pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6" Apr 16 10:07:25.701635 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.701605 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8c5484b-2c69-48a7-aff1-28b3f4709e90-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-6xll6\" (UID: \"e8c5484b-2c69-48a7-aff1-28b3f4709e90\") " pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6" Apr 16 10:07:25.988042 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:25.988006 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6" Apr 16 10:07:26.124826 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:26.124471 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-6xll6"] Apr 16 10:07:26.127256 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:07:26.127214 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8c5484b_2c69_48a7_aff1_28b3f4709e90.slice/crio-6da316cb6e86ef606392a6635e5e02bd6cf99bda84a0e6eff8e427c4d5c00dd4 WatchSource:0}: Error finding container 6da316cb6e86ef606392a6635e5e02bd6cf99bda84a0e6eff8e427c4d5c00dd4: Status 404 returned error can't find the container with id 6da316cb6e86ef606392a6635e5e02bd6cf99bda84a0e6eff8e427c4d5c00dd4 Apr 16 10:07:26.459697 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:26.459665 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6" event={"ID":"e8c5484b-2c69-48a7-aff1-28b3f4709e90","Type":"ContainerStarted","Data":"6da316cb6e86ef606392a6635e5e02bd6cf99bda84a0e6eff8e427c4d5c00dd4"} Apr 16 10:07:27.464195 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:27.464154 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xnnjq" event={"ID":"4a03f4b8-27ee-4dd2-8150-2b73a73e4f06","Type":"ContainerStarted","Data":"93a01a2a5a457a894cb7c7bcfeefef305464ab939f2a248b1e8ac13c85f63878"} Apr 16 10:07:27.479620 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:27.479575 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xnnjq" podStartSLOduration=135.951283309 podStartE2EDuration="2m17.479556123s" podCreationTimestamp="2026-04-16 10:05:10 +0000 UTC" firstStartedPulling="2026-04-16 10:07:25.129754352 +0000 UTC m=+167.729202289" lastFinishedPulling="2026-04-16 10:07:26.658027161 
+0000 UTC m=+169.257475103" observedRunningTime="2026-04-16 10:07:27.478730137 +0000 UTC m=+170.078178093" watchObservedRunningTime="2026-04-16 10:07:27.479556123 +0000 UTC m=+170.079004079" Apr 16 10:07:27.990276 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:27.990176 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:07:28.468538 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:28.468491 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6" event={"ID":"e8c5484b-2c69-48a7-aff1-28b3f4709e90","Type":"ContainerStarted","Data":"f80a38823aff2a275af5a4acbe36485898b42bac111cee9080fb6d8e35301471"} Apr 16 10:07:28.468998 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:28.468547 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6" event={"ID":"e8c5484b-2c69-48a7-aff1-28b3f4709e90","Type":"ContainerStarted","Data":"3b3e38a2059a2eaf32b2b933d9a62795409b240e688bb3498c9ee0e555f38ffd"} Apr 16 10:07:28.484912 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:28.484865 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-6xll6" podStartSLOduration=1.901777562 podStartE2EDuration="3.484838144s" podCreationTimestamp="2026-04-16 10:07:25 +0000 UTC" firstStartedPulling="2026-04-16 10:07:26.129405368 +0000 UTC m=+168.728853302" lastFinishedPulling="2026-04-16 10:07:27.712465947 +0000 UTC m=+170.311913884" observedRunningTime="2026-04-16 10:07:28.483095661 +0000 UTC m=+171.082543616" watchObservedRunningTime="2026-04-16 10:07:28.484838144 +0000 UTC m=+171.084286100" Apr 16 10:07:30.473741 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.473703 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-67ww5"] Apr 16 10:07:30.477157 ip-10-0-135-1 kubenswrapper[2525]: 
I0416 10:07:30.477132 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.479625 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.479599 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7m7d8\"" Apr 16 10:07:30.479773 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.479600 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 10:07:30.479858 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.479610 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 10:07:30.479955 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.479911 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 10:07:30.538582 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.538547 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-tls\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.538582 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.538595 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-accelerators-collector-config\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.538823 ip-10-0-135-1 kubenswrapper[2525]: I0416 
10:07:30.538639 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-wtmp\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.538823 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.538664 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68dcc22a-a6f3-46f9-a654-7e6b830011c3-metrics-client-ca\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.538823 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.538685 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.538823 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.538725 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/68dcc22a-a6f3-46f9-a654-7e6b830011c3-root\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.538823 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.538773 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkhdp\" (UniqueName: \"kubernetes.io/projected/68dcc22a-a6f3-46f9-a654-7e6b830011c3-kube-api-access-pkhdp\") pod \"node-exporter-67ww5\" (UID: 
\"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.538823 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.538812 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68dcc22a-a6f3-46f9-a654-7e6b830011c3-sys\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.539109 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.538870 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-textfile\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.639825 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.639794 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68dcc22a-a6f3-46f9-a654-7e6b830011c3-metrics-client-ca\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.640005 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.639830 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.640005 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.639863 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/68dcc22a-a6f3-46f9-a654-7e6b830011c3-root\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.640005 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.639920 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/68dcc22a-a6f3-46f9-a654-7e6b830011c3-root\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.640169 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.640034 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkhdp\" (UniqueName: \"kubernetes.io/projected/68dcc22a-a6f3-46f9-a654-7e6b830011c3-kube-api-access-pkhdp\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.640169 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.640061 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68dcc22a-a6f3-46f9-a654-7e6b830011c3-sys\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.640169 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.640096 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-textfile\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.640169 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.640124 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-tls\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.640169 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.640147 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68dcc22a-a6f3-46f9-a654-7e6b830011c3-sys\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.640169 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.640150 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-accelerators-collector-config\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.640440 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.640209 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-wtmp\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.640440 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.640348 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-wtmp\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.640440 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:07:30.640426 2525 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: 
secret "node-exporter-tls" not found Apr 16 10:07:30.640632 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:07:30.640476 2525 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-tls podName:68dcc22a-a6f3-46f9-a654-7e6b830011c3 nodeName:}" failed. No retries permitted until 2026-04-16 10:07:31.1404585 +0000 UTC m=+173.739906435 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-tls") pod "node-exporter-67ww5" (UID: "68dcc22a-a6f3-46f9-a654-7e6b830011c3") : secret "node-exporter-tls" not found Apr 16 10:07:30.640708 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.640669 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68dcc22a-a6f3-46f9-a654-7e6b830011c3-metrics-client-ca\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.640830 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.640801 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-textfile\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.641026 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.641010 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-accelerators-collector-config\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.642526 ip-10-0-135-1 
kubenswrapper[2525]: I0416 10:07:30.642489 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.650140 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.650116 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkhdp\" (UniqueName: \"kubernetes.io/projected/68dcc22a-a6f3-46f9-a654-7e6b830011c3-kube-api-access-pkhdp\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:30.988305 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:30.988277 2525 scope.go:117] "RemoveContainer" containerID="160c8c30866dd4393f0fb1643e7e77e5b04f7534fa67b4416575507ba3571632" Apr 16 10:07:31.145797 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.145763 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-tls\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:31.148035 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.148011 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/68dcc22a-a6f3-46f9-a654-7e6b830011c3-node-exporter-tls\") pod \"node-exporter-67ww5\" (UID: \"68dcc22a-a6f3-46f9-a654-7e6b830011c3\") " pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:31.387040 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.387005 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-67ww5" Apr 16 10:07:31.397300 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:07:31.397250 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68dcc22a_a6f3_46f9_a654_7e6b830011c3.slice/crio-cc410bee4b4c861fd56e47d953db60db2b80c4d521ddc0c6b9eeb1842428243a WatchSource:0}: Error finding container cc410bee4b4c861fd56e47d953db60db2b80c4d521ddc0c6b9eeb1842428243a: Status 404 returned error can't find the container with id cc410bee4b4c861fd56e47d953db60db2b80c4d521ddc0c6b9eeb1842428243a Apr 16 10:07:31.450597 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.450566 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sh9pl" Apr 16 10:07:31.480062 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.480028 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-67ww5" event={"ID":"68dcc22a-a6f3-46f9-a654-7e6b830011c3","Type":"ContainerStarted","Data":"cc410bee4b4c861fd56e47d953db60db2b80c4d521ddc0c6b9eeb1842428243a"} Apr 16 10:07:31.482706 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.482680 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log" Apr 16 10:07:31.482861 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.482751 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" event={"ID":"8603601a-c2cf-4413-9cb5-1801baafd774","Type":"ContainerStarted","Data":"bf1001bb8deb4b65b14d9fe0023358fab31020f49a87c0a57743d80f8e704691"} Apr 16 10:07:31.483788 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.483767 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 
16 10:07:31.490711 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.490683 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" Apr 16 10:07:31.526911 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.526857 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-c2rf2" podStartSLOduration=42.177534789 podStartE2EDuration="44.526838673s" podCreationTimestamp="2026-04-16 10:06:47 +0000 UTC" firstStartedPulling="2026-04-16 10:06:47.978751642 +0000 UTC m=+130.578199577" lastFinishedPulling="2026-04-16 10:06:50.328055524 +0000 UTC m=+132.927503461" observedRunningTime="2026-04-16 10:07:31.503717546 +0000 UTC m=+174.103165502" watchObservedRunningTime="2026-04-16 10:07:31.526838673 +0000 UTC m=+174.126286630" Apr 16 10:07:31.623490 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.623457 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-gbvlr"] Apr 16 10:07:31.627954 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.627931 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-gbvlr"
Apr 16 10:07:31.630134 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.630107 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 10:07:31.630134 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.630122 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-2w4wz\""
Apr 16 10:07:31.630134 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.630118 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 10:07:31.637681 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.637627 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-gbvlr"]
Apr 16 10:07:31.751899 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.751866 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r96s\" (UniqueName: \"kubernetes.io/projected/feca609b-cecb-458c-99d4-6f8e74e5ca33-kube-api-access-8r96s\") pod \"downloads-586b57c7b4-gbvlr\" (UID: \"feca609b-cecb-458c-99d4-6f8e74e5ca33\") " pod="openshift-console/downloads-586b57c7b4-gbvlr"
Apr 16 10:07:31.853060 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.853026 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8r96s\" (UniqueName: \"kubernetes.io/projected/feca609b-cecb-458c-99d4-6f8e74e5ca33-kube-api-access-8r96s\") pod \"downloads-586b57c7b4-gbvlr\" (UID: \"feca609b-cecb-458c-99d4-6f8e74e5ca33\") " pod="openshift-console/downloads-586b57c7b4-gbvlr"
Apr 16 10:07:31.862593 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.862571 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r96s\" (UniqueName: \"kubernetes.io/projected/feca609b-cecb-458c-99d4-6f8e74e5ca33-kube-api-access-8r96s\") pod \"downloads-586b57c7b4-gbvlr\" (UID: \"feca609b-cecb-458c-99d4-6f8e74e5ca33\") " pod="openshift-console/downloads-586b57c7b4-gbvlr"
Apr 16 10:07:31.938665 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:31.938588 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-gbvlr"
Apr 16 10:07:32.075115 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:32.075083 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-gbvlr"]
Apr 16 10:07:32.078073 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:07:32.078042 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfeca609b_cecb_458c_99d4_6f8e74e5ca33.slice/crio-9eff81200b2e24a36b15a38ca85ffaf85cd4b76361c606eb2984df16004482ac WatchSource:0}: Error finding container 9eff81200b2e24a36b15a38ca85ffaf85cd4b76361c606eb2984df16004482ac: Status 404 returned error can't find the container with id 9eff81200b2e24a36b15a38ca85ffaf85cd4b76361c606eb2984df16004482ac
Apr 16 10:07:32.488833 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:32.488803 2525 generic.go:358] "Generic (PLEG): container finished" podID="68dcc22a-a6f3-46f9-a654-7e6b830011c3" containerID="0b8b01036e0fa299ae21f4dfc3ae0661001498420c648e2852845335fe22b2d2" exitCode=0
Apr 16 10:07:32.489220 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:32.488887 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-67ww5" event={"ID":"68dcc22a-a6f3-46f9-a654-7e6b830011c3","Type":"ContainerDied","Data":"0b8b01036e0fa299ae21f4dfc3ae0661001498420c648e2852845335fe22b2d2"}
Apr 16 10:07:32.490020 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:32.490000 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-gbvlr" event={"ID":"feca609b-cecb-458c-99d4-6f8e74e5ca33","Type":"ContainerStarted","Data":"9eff81200b2e24a36b15a38ca85ffaf85cd4b76361c606eb2984df16004482ac"}
Apr 16 10:07:33.495596 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:33.495558 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-67ww5" event={"ID":"68dcc22a-a6f3-46f9-a654-7e6b830011c3","Type":"ContainerStarted","Data":"54a1db1df42231e2c97588c77de779d7dbf59a38ebb578a71c2de15f4417ac45"}
Apr 16 10:07:33.496069 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:33.495640 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-67ww5" event={"ID":"68dcc22a-a6f3-46f9-a654-7e6b830011c3","Type":"ContainerStarted","Data":"640aec0195b7b9ba5f0c28cd4c0d2728e371e1170907f2506b44c6a5386a81cf"}
Apr 16 10:07:33.516109 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:33.516047 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-67ww5" podStartSLOduration=2.626877248 podStartE2EDuration="3.516026495s" podCreationTimestamp="2026-04-16 10:07:30 +0000 UTC" firstStartedPulling="2026-04-16 10:07:31.399420023 +0000 UTC m=+173.998867957" lastFinishedPulling="2026-04-16 10:07:32.288569265 +0000 UTC m=+174.888017204" observedRunningTime="2026-04-16 10:07:33.51439819 +0000 UTC m=+176.113846145" watchObservedRunningTime="2026-04-16 10:07:33.516026495 +0000 UTC m=+176.115474452"
Apr 16 10:07:34.015228 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.015200 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b"
Apr 16 10:07:34.420011 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.419972 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-8ddddd4-lc5g8"]
Apr 16 10:07:34.423871 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.423842 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.426569 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.426307 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 10:07:34.426569 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.426360 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-8cl44\""
Apr 16 10:07:34.426569 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.426370 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 10:07:34.426941 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.426655 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 10:07:34.426941 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.426687 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 10:07:34.426941 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.426694 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 10:07:34.427175 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.427156 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-a49ch01v8m9qv\""
Apr 16 10:07:34.433952 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.433928 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-8ddddd4-lc5g8"]
Apr 16 10:07:34.478065 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.478028 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.478238 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.478079 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-thanos-querier-tls\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.478238 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.478107 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.478238 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.478201 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.478444 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.478240 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-metrics-client-ca\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.478444 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.478298 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.478444 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.478344 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-grpc-tls\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.478444 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.478381 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gqbm\" (UniqueName: \"kubernetes.io/projected/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-kube-api-access-4gqbm\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.579276 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.579241 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-thanos-querier-tls\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.579276 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.579282 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.579764 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.579307 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.579764 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.579329 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-metrics-client-ca\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.579764 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.579652 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.579764 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.579738 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-grpc-tls\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.579969 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.579772 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gqbm\" (UniqueName: \"kubernetes.io/projected/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-kube-api-access-4gqbm\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.579969 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.579852 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.580175 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.580106 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-metrics-client-ca\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.582442 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.582398 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.582581 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.582443 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.582975 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.582927 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.582975 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.582946 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-thanos-querier-tls\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.583135 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.582980 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.583135 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.583067 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-secret-grpc-tls\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.589254 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.589229 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gqbm\" (UniqueName: \"kubernetes.io/projected/2deb8c49-4b7e-40e9-b0a9-a6db1e950268-kube-api-access-4gqbm\") pod \"thanos-querier-8ddddd4-lc5g8\" (UID: \"2deb8c49-4b7e-40e9-b0a9-a6db1e950268\") " pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.736995 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.736909 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8"
Apr 16 10:07:34.850253 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.850220 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-688967fbc4-7cgfh"]
Apr 16 10:07:34.855284 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.855260 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:34.857935 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.857808 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 10:07:34.857935 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.857852 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 10:07:34.857935 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.857894 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 10:07:34.857935 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.857903 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-qmc4v\""
Apr 16 10:07:34.858209 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.858068 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 10:07:34.858209 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.858198 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-29lar7v90ba6b\""
Apr 16 10:07:34.861498 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.861134 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-688967fbc4-7cgfh"]
Apr 16 10:07:34.881531 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.881493 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-8ddddd4-lc5g8"]
Apr 16 10:07:34.885036 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:07:34.885005 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2deb8c49_4b7e_40e9_b0a9_a6db1e950268.slice/crio-01ff3eab4f8bc9af5302b27514a0b49178088c3131bf34bee707a543bdf7ca56 WatchSource:0}: Error finding container 01ff3eab4f8bc9af5302b27514a0b49178088c3131bf34bee707a543bdf7ca56: Status 404 returned error can't find the container with id 01ff3eab4f8bc9af5302b27514a0b49178088c3131bf34bee707a543bdf7ca56
Apr 16 10:07:34.984378 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.984341 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f4245b-c7fb-4515-bae2-db6271349435-client-ca-bundle\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:34.984573 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.984383 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c2f4245b-c7fb-4515-bae2-db6271349435-secret-metrics-server-client-certs\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:34.984573 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.984433 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c2f4245b-c7fb-4515-bae2-db6271349435-metrics-server-audit-profiles\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:34.984573 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.984463 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f4245b-c7fb-4515-bae2-db6271349435-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:34.984573 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.984500 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwpx9\" (UniqueName: \"kubernetes.io/projected/c2f4245b-c7fb-4515-bae2-db6271349435-kube-api-access-jwpx9\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:34.984771 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.984581 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c2f4245b-c7fb-4515-bae2-db6271349435-audit-log\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:34.984771 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:34.984679 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c2f4245b-c7fb-4515-bae2-db6271349435-secret-metrics-server-tls\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.085078 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.085043 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwpx9\" (UniqueName: \"kubernetes.io/projected/c2f4245b-c7fb-4515-bae2-db6271349435-kube-api-access-jwpx9\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.085255 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.085091 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c2f4245b-c7fb-4515-bae2-db6271349435-audit-log\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.085255 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.085147 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c2f4245b-c7fb-4515-bae2-db6271349435-secret-metrics-server-tls\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.085255 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.085201 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f4245b-c7fb-4515-bae2-db6271349435-client-ca-bundle\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.085255 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.085229 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c2f4245b-c7fb-4515-bae2-db6271349435-secret-metrics-server-client-certs\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.085474 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.085451 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c2f4245b-c7fb-4515-bae2-db6271349435-metrics-server-audit-profiles\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.085556 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.085537 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f4245b-c7fb-4515-bae2-db6271349435-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.085679 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.085635 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c2f4245b-c7fb-4515-bae2-db6271349435-audit-log\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.086278 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.086248 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f4245b-c7fb-4515-bae2-db6271349435-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.086570 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.086549 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c2f4245b-c7fb-4515-bae2-db6271349435-metrics-server-audit-profiles\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.088084 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.088063 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c2f4245b-c7fb-4515-bae2-db6271349435-secret-metrics-server-client-certs\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.088084 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.088079 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c2f4245b-c7fb-4515-bae2-db6271349435-secret-metrics-server-tls\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.088226 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.088092 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f4245b-c7fb-4515-bae2-db6271349435-client-ca-bundle\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.095919 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.095897 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwpx9\" (UniqueName: \"kubernetes.io/projected/c2f4245b-c7fb-4515-bae2-db6271349435-kube-api-access-jwpx9\") pod \"metrics-server-688967fbc4-7cgfh\" (UID: \"c2f4245b-c7fb-4515-bae2-db6271349435\") " pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.168125 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.167707 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh"
Apr 16 10:07:35.326668 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.326631 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-688967fbc4-7cgfh"]
Apr 16 10:07:35.330270 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:07:35.330239 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2f4245b_c7fb_4515_bae2_db6271349435.slice/crio-f371284c4f0728b5cc6105a217428c0359932bbcf112154b15d81ad7a4d16b3f WatchSource:0}: Error finding container f371284c4f0728b5cc6105a217428c0359932bbcf112154b15d81ad7a4d16b3f: Status 404 returned error can't find the container with id f371284c4f0728b5cc6105a217428c0359932bbcf112154b15d81ad7a4d16b3f
Apr 16 10:07:35.503024 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.502936 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh" event={"ID":"c2f4245b-c7fb-4515-bae2-db6271349435","Type":"ContainerStarted","Data":"f371284c4f0728b5cc6105a217428c0359932bbcf112154b15d81ad7a4d16b3f"}
Apr 16 10:07:35.504171 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:35.504137 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8" event={"ID":"2deb8c49-4b7e-40e9-b0a9-a6db1e950268","Type":"ContainerStarted","Data":"01ff3eab4f8bc9af5302b27514a0b49178088c3131bf34bee707a543bdf7ca56"}
Apr 16 10:07:36.730764 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.730725 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 10:07:36.735995 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.735961 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 10:07:36.741542 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.738925 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 10:07:36.741542 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.739156 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 10:07:36.741542 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.739351 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 10:07:36.741542 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.739554 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 10:07:36.741542 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.739729 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 10:07:36.741542 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.739940 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 10:07:36.741542 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.740115 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 10:07:36.741542 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.740772 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3m7ce9s2mumde\""
Apr 16 10:07:36.744743 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.740818 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 10:07:36.744743 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.740818 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 10:07:36.744743 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.740850 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 10:07:36.747754 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.747657 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 10:07:36.749108 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.749086 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-n8w89\""
Apr 16 10:07:36.749108 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.749112 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 10:07:36.750142 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.750031 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.806838 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a18c918-39f6-4be5-b45e-3afec94f3a51-config-out\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.806889 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.806918 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.806946 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.806981 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.807023 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.807055 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.807086 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.807141 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.807200 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.807225 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName:
\"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-web-config\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.807263 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-config\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.807284 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.807306 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a18c918-39f6-4be5-b45e-3afec94f3a51-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.807328 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmnhz\" (UniqueName: \"kubernetes.io/projected/0a18c918-39f6-4be5-b45e-3afec94f3a51-kube-api-access-dmnhz\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.807643 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.807358 2525 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.808552 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.807381 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.808552 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.807403 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.908862 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.908809 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909054 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.908882 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909054 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.908913 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-web-config\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909054 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.908951 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-config\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909054 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.908977 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909054 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.909002 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a18c918-39f6-4be5-b45e-3afec94f3a51-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909054 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.909025 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmnhz\" (UniqueName: \"kubernetes.io/projected/0a18c918-39f6-4be5-b45e-3afec94f3a51-kube-api-access-dmnhz\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.909450 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909595 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.909525 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909595 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.909555 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909694 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.909619 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a18c918-39f6-4be5-b45e-3afec94f3a51-config-out\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909694 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.909654 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909694 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.909677 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909834 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.909703 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909834 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.909744 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909834 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.909777 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909834 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.909807 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.909834 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.909808 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.910069 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.909847 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.910069 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.909888 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.910193 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.910161 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.913906 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.913769 2525 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.914395 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.914373 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.914503 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.914471 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.915347 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.914964 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-web-config\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.915347 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.915059 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a18c918-39f6-4be5-b45e-3afec94f3a51-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.915347 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.915256 2525 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-config\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.915347 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.915246 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.916190 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.916144 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.917074 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.916465 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.917373 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.917330 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.918180 ip-10-0-135-1 kubenswrapper[2525]: I0416 
10:07:36.918135 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.918629 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.918608 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a18c918-39f6-4be5-b45e-3afec94f3a51-config-out\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.918804 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.918781 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.919631 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.919609 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:36.943642 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:36.943577 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmnhz\" (UniqueName: \"kubernetes.io/projected/0a18c918-39f6-4be5-b45e-3afec94f3a51-kube-api-access-dmnhz\") pod \"prometheus-k8s-0\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:37.053664 ip-10-0-135-1 kubenswrapper[2525]: I0416 
10:07:37.053615 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:37.957819 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:37.957781 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 10:07:37.964867 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:07:37.964836 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a18c918_39f6_4be5_b45e_3afec94f3a51.slice/crio-9f71373d7d25cae73633bca75caf9c00da67779f00a44fdc6cbf91718a5a1f6d WatchSource:0}: Error finding container 9f71373d7d25cae73633bca75caf9c00da67779f00a44fdc6cbf91718a5a1f6d: Status 404 returned error can't find the container with id 9f71373d7d25cae73633bca75caf9c00da67779f00a44fdc6cbf91718a5a1f6d Apr 16 10:07:38.520256 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:38.520200 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh" event={"ID":"c2f4245b-c7fb-4515-bae2-db6271349435","Type":"ContainerStarted","Data":"eecf0e0538350b20111dd2b29e04aaf3923d8315fb5e7ed6f04b008858732d2d"} Apr 16 10:07:38.524328 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:38.524045 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8" event={"ID":"2deb8c49-4b7e-40e9-b0a9-a6db1e950268","Type":"ContainerStarted","Data":"de70d276e6d5aa5f32341a02f3ade198b79ee39e2e40d18467afd7b5be848073"} Apr 16 10:07:38.524328 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:38.524085 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8" event={"ID":"2deb8c49-4b7e-40e9-b0a9-a6db1e950268","Type":"ContainerStarted","Data":"9867e75cb29918a4544d6a04fdef2c09f1b33fda75b36b508e3ad17f4c309f52"} Apr 16 10:07:38.524328 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:38.524101 2525 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8" event={"ID":"2deb8c49-4b7e-40e9-b0a9-a6db1e950268","Type":"ContainerStarted","Data":"e30a391ff5198a90146db9f4908fadb949545108b67c2c354545fba747cfbe27"} Apr 16 10:07:38.530212 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:38.530183 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a18c918-39f6-4be5-b45e-3afec94f3a51","Type":"ContainerStarted","Data":"9f71373d7d25cae73633bca75caf9c00da67779f00a44fdc6cbf91718a5a1f6d"} Apr 16 10:07:38.540743 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:38.540674 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh" podStartSLOduration=2.088639763 podStartE2EDuration="4.540660633s" podCreationTimestamp="2026-04-16 10:07:34 +0000 UTC" firstStartedPulling="2026-04-16 10:07:35.332724966 +0000 UTC m=+177.932172904" lastFinishedPulling="2026-04-16 10:07:37.784745825 +0000 UTC m=+180.384193774" observedRunningTime="2026-04-16 10:07:38.539475449 +0000 UTC m=+181.138923407" watchObservedRunningTime="2026-04-16 10:07:38.540660633 +0000 UTC m=+181.140108646" Apr 16 10:07:39.028703 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.028654 2525 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" podUID="5a082bd3-2d68-4bba-a387-faa89f319dac" containerName="registry" containerID="cri-o://6a50f8ca0c1a1e17e32c9bdfa14b81dc5b8c4672ef1d0a05aa88b888c4934786" gracePeriod=30 Apr 16 10:07:39.453370 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.453345 2525 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:07:39.534185 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.534115 2525 generic.go:358] "Generic (PLEG): container finished" podID="5a082bd3-2d68-4bba-a387-faa89f319dac" containerID="6a50f8ca0c1a1e17e32c9bdfa14b81dc5b8c4672ef1d0a05aa88b888c4934786" exitCode=0 Apr 16 10:07:39.534328 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.534199 2525 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" Apr 16 10:07:39.534328 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.534209 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" event={"ID":"5a082bd3-2d68-4bba-a387-faa89f319dac","Type":"ContainerDied","Data":"6a50f8ca0c1a1e17e32c9bdfa14b81dc5b8c4672ef1d0a05aa88b888c4934786"} Apr 16 10:07:39.534328 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.534258 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5ccd57b7c8-vc29b" event={"ID":"5a082bd3-2d68-4bba-a387-faa89f319dac","Type":"ContainerDied","Data":"45d7dbe60348ba1237d1c147dcd0b4620224470606647cf4fa7fcc00e9bf9991"} Apr 16 10:07:39.534328 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.534281 2525 scope.go:117] "RemoveContainer" containerID="6a50f8ca0c1a1e17e32c9bdfa14b81dc5b8c4672ef1d0a05aa88b888c4934786" Apr 16 10:07:39.536406 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.536386 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w4pl\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-kube-api-access-5w4pl\") pod \"5a082bd3-2d68-4bba-a387-faa89f319dac\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " Apr 16 10:07:39.536545 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.536435 2525 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls\") pod \"5a082bd3-2d68-4bba-a387-faa89f319dac\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " Apr 16 10:07:39.536545 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.536474 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5a082bd3-2d68-4bba-a387-faa89f319dac-image-registry-private-configuration\") pod \"5a082bd3-2d68-4bba-a387-faa89f319dac\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " Apr 16 10:07:39.536545 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.536539 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-bound-sa-token\") pod \"5a082bd3-2d68-4bba-a387-faa89f319dac\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " Apr 16 10:07:39.536707 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.536577 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a082bd3-2d68-4bba-a387-faa89f319dac-trusted-ca\") pod \"5a082bd3-2d68-4bba-a387-faa89f319dac\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " Apr 16 10:07:39.536707 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.536627 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a082bd3-2d68-4bba-a387-faa89f319dac-installation-pull-secrets\") pod \"5a082bd3-2d68-4bba-a387-faa89f319dac\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " Apr 16 10:07:39.536707 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.536688 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/5a082bd3-2d68-4bba-a387-faa89f319dac-ca-trust-extracted\") pod \"5a082bd3-2d68-4bba-a387-faa89f319dac\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " Apr 16 10:07:39.536849 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.536714 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-certificates\") pod \"5a082bd3-2d68-4bba-a387-faa89f319dac\" (UID: \"5a082bd3-2d68-4bba-a387-faa89f319dac\") " Apr 16 10:07:39.537301 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.537242 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a082bd3-2d68-4bba-a387-faa89f319dac-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5a082bd3-2d68-4bba-a387-faa89f319dac" (UID: "5a082bd3-2d68-4bba-a387-faa89f319dac"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 10:07:39.537732 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.537687 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5a082bd3-2d68-4bba-a387-faa89f319dac" (UID: "5a082bd3-2d68-4bba-a387-faa89f319dac"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 10:07:39.538652 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.538480 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8" event={"ID":"2deb8c49-4b7e-40e9-b0a9-a6db1e950268","Type":"ContainerStarted","Data":"ec92feb163ffb58b3c79688cd8297cb6b55251da35bc4bf1c3311bfac794cac6"} Apr 16 10:07:39.538652 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.538534 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8" event={"ID":"2deb8c49-4b7e-40e9-b0a9-a6db1e950268","Type":"ContainerStarted","Data":"e572cb0722ecd03cfd0dc09215e56326a076d70a2ccde86c96d24a62edb9e76b"} Apr 16 10:07:39.539996 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.539946 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-kube-api-access-5w4pl" (OuterVolumeSpecName: "kube-api-access-5w4pl") pod "5a082bd3-2d68-4bba-a387-faa89f319dac" (UID: "5a082bd3-2d68-4bba-a387-faa89f319dac"). InnerVolumeSpecName "kube-api-access-5w4pl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:07:39.542189 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.540730 2525 generic.go:358] "Generic (PLEG): container finished" podID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerID="ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464" exitCode=0 Apr 16 10:07:39.542189 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.541269 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a082bd3-2d68-4bba-a387-faa89f319dac-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5a082bd3-2d68-4bba-a387-faa89f319dac" (UID: "5a082bd3-2d68-4bba-a387-faa89f319dac"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 10:07:39.542189 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.541389 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a18c918-39f6-4be5-b45e-3afec94f3a51","Type":"ContainerDied","Data":"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464"} Apr 16 10:07:39.542947 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.542697 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5a082bd3-2d68-4bba-a387-faa89f319dac" (UID: "5a082bd3-2d68-4bba-a387-faa89f319dac"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:07:39.543633 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.543597 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5a082bd3-2d68-4bba-a387-faa89f319dac" (UID: "5a082bd3-2d68-4bba-a387-faa89f319dac"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:07:39.544055 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.544023 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a082bd3-2d68-4bba-a387-faa89f319dac-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5a082bd3-2d68-4bba-a387-faa89f319dac" (UID: "5a082bd3-2d68-4bba-a387-faa89f319dac"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 10:07:39.546070 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.546051 2525 scope.go:117] "RemoveContainer" containerID="6a50f8ca0c1a1e17e32c9bdfa14b81dc5b8c4672ef1d0a05aa88b888c4934786" Apr 16 10:07:39.546359 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:07:39.546335 2525 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a50f8ca0c1a1e17e32c9bdfa14b81dc5b8c4672ef1d0a05aa88b888c4934786\": container with ID starting with 6a50f8ca0c1a1e17e32c9bdfa14b81dc5b8c4672ef1d0a05aa88b888c4934786 not found: ID does not exist" containerID="6a50f8ca0c1a1e17e32c9bdfa14b81dc5b8c4672ef1d0a05aa88b888c4934786" Apr 16 10:07:39.546437 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.546369 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a50f8ca0c1a1e17e32c9bdfa14b81dc5b8c4672ef1d0a05aa88b888c4934786"} err="failed to get container status \"6a50f8ca0c1a1e17e32c9bdfa14b81dc5b8c4672ef1d0a05aa88b888c4934786\": rpc error: code = NotFound desc = could not find container \"6a50f8ca0c1a1e17e32c9bdfa14b81dc5b8c4672ef1d0a05aa88b888c4934786\": container with ID starting with 6a50f8ca0c1a1e17e32c9bdfa14b81dc5b8c4672ef1d0a05aa88b888c4934786 not found: ID does not exist" Apr 16 10:07:39.548882 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.548852 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a082bd3-2d68-4bba-a387-faa89f319dac-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5a082bd3-2d68-4bba-a387-faa89f319dac" (UID: "5a082bd3-2d68-4bba-a387-faa89f319dac"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 10:07:39.638921 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.638857 2525 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-bound-sa-token\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\"" Apr 16 10:07:39.638921 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.638888 2525 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a082bd3-2d68-4bba-a387-faa89f319dac-trusted-ca\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\"" Apr 16 10:07:39.638921 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.638910 2525 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a082bd3-2d68-4bba-a387-faa89f319dac-installation-pull-secrets\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\"" Apr 16 10:07:39.638921 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.638928 2525 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a082bd3-2d68-4bba-a387-faa89f319dac-ca-trust-extracted\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\"" Apr 16 10:07:39.639275 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.638939 2525 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-certificates\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\"" Apr 16 10:07:39.639275 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.638949 2525 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5w4pl\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-kube-api-access-5w4pl\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\"" Apr 16 10:07:39.639275 ip-10-0-135-1 
kubenswrapper[2525]: I0416 10:07:39.638959 2525 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a082bd3-2d68-4bba-a387-faa89f319dac-registry-tls\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\"" Apr 16 10:07:39.639275 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.638973 2525 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5a082bd3-2d68-4bba-a387-faa89f319dac-image-registry-private-configuration\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\"" Apr 16 10:07:39.856905 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.856875 2525 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5ccd57b7c8-vc29b"] Apr 16 10:07:39.861159 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.861130 2525 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5ccd57b7c8-vc29b"] Apr 16 10:07:39.993212 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:39.993176 2525 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a082bd3-2d68-4bba-a387-faa89f319dac" path="/var/lib/kubelet/pods/5a082bd3-2d68-4bba-a387-faa89f319dac/volumes" Apr 16 10:07:40.548504 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:40.548402 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8" event={"ID":"2deb8c49-4b7e-40e9-b0a9-a6db1e950268","Type":"ContainerStarted","Data":"a612497448f188afe185f38c56b8f1662e185e8541eb1df4231f0127ef2af90a"} Apr 16 10:07:40.548978 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:40.548693 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8" Apr 16 10:07:40.573295 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:40.573224 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8" podStartSLOduration=2.2123371450000002 podStartE2EDuration="6.573204336s" podCreationTimestamp="2026-04-16 10:07:34 +0000 UTC" firstStartedPulling="2026-04-16 10:07:34.887172088 +0000 UTC m=+177.486620022" lastFinishedPulling="2026-04-16 10:07:39.248039272 +0000 UTC m=+181.847487213" observedRunningTime="2026-04-16 10:07:40.571937322 +0000 UTC m=+183.171385278" watchObservedRunningTime="2026-04-16 10:07:40.573204336 +0000 UTC m=+183.172652294" Apr 16 10:07:46.562137 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:46.562107 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-8ddddd4-lc5g8" Apr 16 10:07:50.583617 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:50.583373 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a18c918-39f6-4be5-b45e-3afec94f3a51","Type":"ContainerStarted","Data":"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668"} Apr 16 10:07:50.583617 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:50.583417 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a18c918-39f6-4be5-b45e-3afec94f3a51","Type":"ContainerStarted","Data":"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd"} Apr 16 10:07:50.583617 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:50.583434 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a18c918-39f6-4be5-b45e-3afec94f3a51","Type":"ContainerStarted","Data":"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3"} Apr 16 10:07:50.583617 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:50.583444 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"0a18c918-39f6-4be5-b45e-3afec94f3a51","Type":"ContainerStarted","Data":"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44"} Apr 16 10:07:50.583617 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:50.583458 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a18c918-39f6-4be5-b45e-3afec94f3a51","Type":"ContainerStarted","Data":"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143"} Apr 16 10:07:50.583617 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:50.583470 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a18c918-39f6-4be5-b45e-3afec94f3a51","Type":"ContainerStarted","Data":"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097"} Apr 16 10:07:50.585090 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:50.585061 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-gbvlr" event={"ID":"feca609b-cecb-458c-99d4-6f8e74e5ca33","Type":"ContainerStarted","Data":"4e9845ffa6cce7c0ccb34fd3f04ba5c0b054ed7ff3de8673da8de1dab739ba42"} Apr 16 10:07:50.585443 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:50.585419 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-gbvlr" Apr 16 10:07:50.603963 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:50.603929 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-gbvlr" Apr 16 10:07:50.613366 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:50.613309 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.77525951 podStartE2EDuration="14.613293475s" podCreationTimestamp="2026-04-16 10:07:36 +0000 UTC" firstStartedPulling="2026-04-16 10:07:37.967011059 +0000 UTC m=+180.566458997" lastFinishedPulling="2026-04-16 
10:07:49.80504502 +0000 UTC m=+192.404492962" observedRunningTime="2026-04-16 10:07:50.61114899 +0000 UTC m=+193.210596949" watchObservedRunningTime="2026-04-16 10:07:50.613293475 +0000 UTC m=+193.212741430" Apr 16 10:07:50.628248 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:50.628194 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-gbvlr" podStartSLOduration=1.853184813 podStartE2EDuration="19.62817495s" podCreationTimestamp="2026-04-16 10:07:31 +0000 UTC" firstStartedPulling="2026-04-16 10:07:32.080370945 +0000 UTC m=+174.679818878" lastFinishedPulling="2026-04-16 10:07:49.855361078 +0000 UTC m=+192.454809015" observedRunningTime="2026-04-16 10:07:50.627225468 +0000 UTC m=+193.226673427" watchObservedRunningTime="2026-04-16 10:07:50.62817495 +0000 UTC m=+193.227622906" Apr 16 10:07:52.054066 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:52.054025 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:07:55.168497 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:55.168463 2525 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh" Apr 16 10:07:55.169425 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:55.169395 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh" Apr 16 10:07:56.606566 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:56.606500 2525 generic.go:358] "Generic (PLEG): container finished" podID="e96bf9a9-c840-4751-89c6-13968641abc6" containerID="16321ee4d81de3ab50a0358dd308c7ea1c32bc7e53f9ae0f2abe20e4d5b5d1dc" exitCode=0 Apr 16 10:07:56.607094 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:56.606575 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh" event={"ID":"e96bf9a9-c840-4751-89c6-13968641abc6","Type":"ContainerDied","Data":"16321ee4d81de3ab50a0358dd308c7ea1c32bc7e53f9ae0f2abe20e4d5b5d1dc"} Apr 16 10:07:56.607094 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:56.606957 2525 scope.go:117] "RemoveContainer" containerID="16321ee4d81de3ab50a0358dd308c7ea1c32bc7e53f9ae0f2abe20e4d5b5d1dc" Apr 16 10:07:57.612324 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:07:57.612286 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vnkxh" event={"ID":"e96bf9a9-c840-4751-89c6-13968641abc6","Type":"ContainerStarted","Data":"d552e6e5b0639f9783c004e9d24124b44c71af93a2821090a9b2cc314d8320f5"} Apr 16 10:08:06.903641 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:06.903615 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xnnjq_4a03f4b8-27ee-4dd2-8150-2b73a73e4f06/serve-healthcheck-canary/0.log" Apr 16 10:08:15.173035 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:15.172956 2525 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh" Apr 16 10:08:15.176855 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:15.176835 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-688967fbc4-7cgfh" Apr 16 10:08:37.054035 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:37.053997 2525 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:37.074179 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:37.074151 2525 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:37.743361 ip-10-0-135-1 kubenswrapper[2525]: 
I0416 10:08:37.743336 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:49.866450 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:49.866392 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs\") pod \"network-metrics-daemon-mczdx\" (UID: \"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66\") " pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:08:49.868769 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:49.868745 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4739af65-2ca2-4e8f-9b0e-ddeac76a9b66-metrics-certs\") pod \"network-metrics-daemon-mczdx\" (UID: \"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66\") " pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:08:49.893636 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:49.893611 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-btvgh\"" Apr 16 10:08:49.901798 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:49.901778 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mczdx" Apr 16 10:08:50.024761 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:50.024738 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mczdx"] Apr 16 10:08:50.027579 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:08:50.027550 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4739af65_2ca2_4e8f_9b0e_ddeac76a9b66.slice/crio-cabf762083ff51f06d0787792f36f2c9cf129171fb69133c852a0e7cd36e8b21 WatchSource:0}: Error finding container cabf762083ff51f06d0787792f36f2c9cf129171fb69133c852a0e7cd36e8b21: Status 404 returned error can't find the container with id cabf762083ff51f06d0787792f36f2c9cf129171fb69133c852a0e7cd36e8b21 Apr 16 10:08:50.767393 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:50.767350 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mczdx" event={"ID":"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66","Type":"ContainerStarted","Data":"cabf762083ff51f06d0787792f36f2c9cf129171fb69133c852a0e7cd36e8b21"} Apr 16 10:08:51.772141 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:51.772105 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mczdx" event={"ID":"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66","Type":"ContainerStarted","Data":"d9df8d6d7b8ebc4feac28008c77281ffe879366b342cce6131c09c9f7bef4518"} Apr 16 10:08:51.772531 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:51.772146 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mczdx" event={"ID":"4739af65-2ca2-4e8f-9b0e-ddeac76a9b66","Type":"ContainerStarted","Data":"09d0c572b0f0726846a251e22abf24e1f31c270159c076ff6c1f3fbd661d50ef"} Apr 16 10:08:51.790685 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:51.790628 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-mczdx" podStartSLOduration=252.869241322 podStartE2EDuration="4m13.790609569s" podCreationTimestamp="2026-04-16 10:04:38 +0000 UTC" firstStartedPulling="2026-04-16 10:08:50.029824561 +0000 UTC m=+252.629272495" lastFinishedPulling="2026-04-16 10:08:50.951192808 +0000 UTC m=+253.550640742" observedRunningTime="2026-04-16 10:08:51.789201504 +0000 UTC m=+254.388649460" watchObservedRunningTime="2026-04-16 10:08:51.790609569 +0000 UTC m=+254.390057513" Apr 16 10:08:55.097902 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.097868 2525 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 10:08:55.098453 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.098285 2525 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="prometheus" containerID="cri-o://a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097" gracePeriod=600 Apr 16 10:08:55.098453 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.098316 2525 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="kube-rbac-proxy" containerID="cri-o://cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd" gracePeriod=600 Apr 16 10:08:55.098453 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.098325 2525 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="thanos-sidecar" containerID="cri-o://26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44" gracePeriod=600 Apr 16 10:08:55.098453 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.098339 2525 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="kube-rbac-proxy-web" containerID="cri-o://339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3" gracePeriod=600 Apr 16 10:08:55.098453 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.098369 2525 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="config-reloader" containerID="cri-o://58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143" gracePeriod=600 Apr 16 10:08:55.098453 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.098406 2525 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="kube-rbac-proxy-thanos" containerID="cri-o://316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668" gracePeriod=600 Apr 16 10:08:55.354357 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.354289 2525 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.413851 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.413813 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-tls\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414020 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.413880 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414020 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.413922 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-config\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414020 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.413952 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a18c918-39f6-4be5-b45e-3afec94f3a51-config-out\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414020 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.413974 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-web-config\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414020 ip-10-0-135-1 
kubenswrapper[2525]: I0416 10:08:55.414009 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-serving-certs-ca-bundle\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414365 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.414035 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-kubelet-serving-ca-bundle\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414365 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.414069 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a18c918-39f6-4be5-b45e-3afec94f3a51-tls-assets\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414365 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.414098 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-k8s-rulefiles-0\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414365 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.414126 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-trusted-ca-bundle\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414365 ip-10-0-135-1 kubenswrapper[2525]: 
I0416 10:08:55.414152 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414365 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.414192 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-metrics-client-ca\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414365 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.414228 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmnhz\" (UniqueName: \"kubernetes.io/projected/0a18c918-39f6-4be5-b45e-3afec94f3a51-kube-api-access-dmnhz\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414365 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.414266 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-metrics-client-certs\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414365 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.414293 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-kube-rbac-proxy\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414365 ip-10-0-135-1 kubenswrapper[2525]: I0416 
10:08:55.414318 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-k8s-db\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414365 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.414345 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-grpc-tls\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414945 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.414380 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-thanos-prometheus-http-client-file\") pod \"0a18c918-39f6-4be5-b45e-3afec94f3a51\" (UID: \"0a18c918-39f6-4be5-b45e-3afec94f3a51\") " Apr 16 10:08:55.414945 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.414547 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 10:08:55.414945 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.414695 2525 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.415089 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.414978 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 10:08:55.415267 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.415243 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 10:08:55.417214 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.417185 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 10:08:55.417465 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.417438 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 10:08:55.417465 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.417448 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 10:08:55.419163 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.419124 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 10:08:55.421630 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.421604 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 10:08:55.421752 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.421643 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 10:08:55.421836 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.421696 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a18c918-39f6-4be5-b45e-3afec94f3a51-config-out" (OuterVolumeSpecName: "config-out") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 10:08:55.421914 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.421754 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 10:08:55.422003 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.421972 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-config" (OuterVolumeSpecName: "config") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 10:08:55.422064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.422004 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 10:08:55.422064 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.422009 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 10:08:55.422186 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.422163 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a18c918-39f6-4be5-b45e-3afec94f3a51-kube-api-access-dmnhz" (OuterVolumeSpecName: "kube-api-access-dmnhz") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "kube-api-access-dmnhz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 10:08:55.422274 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.422250 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a18c918-39f6-4be5-b45e-3afec94f3a51-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 10:08:55.422377 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.422284 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 10:08:55.434335 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.434311 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-web-config" (OuterVolumeSpecName: "web-config") pod "0a18c918-39f6-4be5-b45e-3afec94f3a51" (UID: "0a18c918-39f6-4be5-b45e-3afec94f3a51"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 10:08:55.515555 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515504 2525 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-config\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515555 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515553 2525 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a18c918-39f6-4be5-b45e-3afec94f3a51-config-out\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515738 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515566 2525 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-web-config\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515738 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515575 2525 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515738 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515585 2525 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a18c918-39f6-4be5-b45e-3afec94f3a51-tls-assets\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515738 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515594 2525 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515738 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515603 2525 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-trusted-ca-bundle\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515738 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515612 2525 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515738 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515622 2525 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a18c918-39f6-4be5-b45e-3afec94f3a51-configmap-metrics-client-ca\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515738 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515631 2525 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dmnhz\" (UniqueName: \"kubernetes.io/projected/0a18c918-39f6-4be5-b45e-3afec94f3a51-kube-api-access-dmnhz\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515738 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515640 2525 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-metrics-client-certs\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515738 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515649 2525 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-kube-rbac-proxy\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515738 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515657 2525 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0a18c918-39f6-4be5-b45e-3afec94f3a51-prometheus-k8s-db\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515738 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515665 2525 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-grpc-tls\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515738 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515675 2525 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-thanos-prometheus-http-client-file\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515738 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515684 2525 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-tls\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.515738 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.515693 2525 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0a18c918-39f6-4be5-b45e-3afec94f3a51-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:08:55.787092 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.787016 2525 generic.go:358] "Generic (PLEG): container finished" podID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerID="316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668" exitCode=0
Apr 16 10:08:55.787092 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.787042 2525 generic.go:358] "Generic (PLEG): container finished" podID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerID="cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd" exitCode=0
Apr 16 10:08:55.787092 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.787050 2525 generic.go:358] "Generic (PLEG): container finished" podID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerID="339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3" exitCode=0
Apr 16 10:08:55.787092 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.787056 2525 generic.go:358] "Generic (PLEG): container finished" podID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerID="26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44" exitCode=0
Apr 16 10:08:55.787092 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.787061 2525 generic.go:358] "Generic (PLEG): container finished" podID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerID="58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143" exitCode=0
Apr 16 10:08:55.787092 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.787066 2525 generic.go:358] "Generic (PLEG): container finished" podID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerID="a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097" exitCode=0
Apr 16 10:08:55.787394 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.787140 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a18c918-39f6-4be5-b45e-3afec94f3a51","Type":"ContainerDied","Data":"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668"}
Apr 16 10:08:55.787394 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.787165 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a18c918-39f6-4be5-b45e-3afec94f3a51","Type":"ContainerDied","Data":"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd"}
Apr 16 10:08:55.787394 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.787178 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a18c918-39f6-4be5-b45e-3afec94f3a51","Type":"ContainerDied","Data":"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3"}
Apr 16 10:08:55.787394 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.787180 2525 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 10:08:55.787394 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.787195 2525 scope.go:117] "RemoveContainer" containerID="316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668"
Apr 16 10:08:55.787394 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.787187 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a18c918-39f6-4be5-b45e-3afec94f3a51","Type":"ContainerDied","Data":"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44"}
Apr 16 10:08:55.787394 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.787325 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a18c918-39f6-4be5-b45e-3afec94f3a51","Type":"ContainerDied","Data":"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143"}
Apr 16 10:08:55.787394 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.787356 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a18c918-39f6-4be5-b45e-3afec94f3a51","Type":"ContainerDied","Data":"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097"}
Apr 16 10:08:55.787394 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.787372 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a18c918-39f6-4be5-b45e-3afec94f3a51","Type":"ContainerDied","Data":"9f71373d7d25cae73633bca75caf9c00da67779f00a44fdc6cbf91718a5a1f6d"}
Apr 16 10:08:55.794551 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.794501 2525 scope.go:117] "RemoveContainer" containerID="cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd"
Apr 16 10:08:55.801207 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.801190 2525 scope.go:117] "RemoveContainer" containerID="339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3"
Apr 16 10:08:55.807456 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.807440 2525 scope.go:117] "RemoveContainer" containerID="26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44"
Apr 16 10:08:55.811639 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.811616 2525 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 10:08:55.814476 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.814462 2525 scope.go:117] "RemoveContainer" containerID="58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143"
Apr 16 10:08:55.816863 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.816839 2525 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 10:08:55.821770 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.821755 2525 scope.go:117] "RemoveContainer" containerID="a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097"
Apr 16 10:08:55.828388 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.828374 2525 scope.go:117] "RemoveContainer" containerID="ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464"
Apr 16 10:08:55.834271 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.834258 2525 scope.go:117] "RemoveContainer" containerID="316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668"
Apr 16 10:08:55.834548 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:08:55.834503 2525 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668\": container with ID starting with 316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668 not found: ID does not exist" containerID="316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668"
Apr 16 10:08:55.834650 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.834554 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668"} err="failed to get container status \"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668\": rpc error: code = NotFound desc = could not find container \"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668\": container with ID starting with 316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668 not found: ID does not exist"
Apr 16 10:08:55.834650 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.834573 2525 scope.go:117] "RemoveContainer" containerID="cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd"
Apr 16 10:08:55.834875 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:08:55.834844 2525 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd\": container with ID starting with cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd not found: ID does not exist" containerID="cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd"
Apr 16 10:08:55.834925 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.834884 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd"} err="failed to get container status \"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd\": rpc error: code = NotFound desc = could not find container \"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd\": container with ID starting with cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd not found: ID does not exist"
Apr 16 10:08:55.834925 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.834907 2525 scope.go:117] "RemoveContainer" containerID="339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3"
Apr 16 10:08:55.835161 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:08:55.835146 2525 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3\": container with ID starting with 339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3 not found: ID does not exist" containerID="339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3"
Apr 16 10:08:55.835209 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.835163 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3"} err="failed to get container status \"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3\": rpc error: code = NotFound desc = could not find container \"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3\": container with ID starting with 339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3 not found: ID does not exist"
Apr 16 10:08:55.835209 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.835178 2525 scope.go:117] "RemoveContainer" containerID="26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44"
Apr 16 10:08:55.835460 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:08:55.835404 2525 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44\": container with ID starting with 26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44 not found: ID does not exist" containerID="26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44"
Apr 16 10:08:55.835460 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.835427 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44"} err="failed to get container status \"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44\": rpc error: code = NotFound desc = could not find container \"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44\": container with ID starting with 26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44 not found: ID does not exist"
Apr 16 10:08:55.835460 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.835446 2525 scope.go:117] "RemoveContainer" containerID="58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143"
Apr 16 10:08:55.835686 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:08:55.835672 2525 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143\": container with ID starting with 58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143 not found: ID does not exist" containerID="58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143"
Apr 16 10:08:55.835723 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.835690 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143"} err="failed to get container status \"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143\": rpc error: code = NotFound desc = could not find container \"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143\": container with ID starting with 58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143 not found: ID does not exist"
Apr 16 10:08:55.835723 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.835705 2525 scope.go:117] "RemoveContainer" containerID="a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097"
Apr 16 10:08:55.835907 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:08:55.835892 2525 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097\": container with ID starting with a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097 not found: ID does not exist" containerID="a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097"
Apr 16 10:08:55.835940 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.835910 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097"} err="failed to get container status \"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097\": rpc error: code = NotFound desc = could not find container \"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097\": container with ID starting with a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097 not found: ID does not exist"
Apr 16 10:08:55.835940 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.835922 2525 scope.go:117] "RemoveContainer" containerID="ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464"
Apr 16 10:08:55.836151 ip-10-0-135-1 kubenswrapper[2525]: E0416 10:08:55.836133 2525 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464\": container with ID starting with ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464 not found: ID does not exist" containerID="ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464"
Apr 16 10:08:55.836216 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.836158 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464"} err="failed to get container status \"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464\": rpc error: code = NotFound desc = could not find container \"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464\": container with ID starting with ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464 not found: ID does not exist"
Apr 16 10:08:55.836216 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.836180 2525 scope.go:117] "RemoveContainer" containerID="316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668"
Apr 16 10:08:55.836380 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.836365 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668"} err="failed to get container status \"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668\": rpc error: code = NotFound desc = could not find container \"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668\": container with ID starting with 316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668 not found: ID does not exist"
Apr 16 10:08:55.836444 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.836382 2525 scope.go:117] "RemoveContainer" containerID="cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd"
Apr 16 10:08:55.836585 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.836569 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd"} err="failed to get container status \"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd\": rpc error: code = NotFound desc = could not find container \"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd\": container with ID starting with cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd not found: ID does not exist"
Apr 16 10:08:55.836639 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.836586 2525 scope.go:117] "RemoveContainer" containerID="339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3"
Apr 16 10:08:55.836771 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.836757 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3"} err="failed to get container status \"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3\": rpc error: code = NotFound desc = could not find container \"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3\": container with ID starting with 339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3 not found: ID does not exist"
Apr 16 10:08:55.836818 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.836772 2525 scope.go:117] "RemoveContainer" containerID="26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44"
Apr 16 10:08:55.836946 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.836930 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44"} err="failed to get container status \"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44\": rpc error: code = NotFound desc = could not find container \"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44\": container with ID starting with 26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44 not found: ID does not exist"
Apr 16 10:08:55.836996 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.836946 2525 scope.go:117] "RemoveContainer" containerID="58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143"
Apr 16 10:08:55.837106 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.837092 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143"} err="failed to get container status \"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143\": rpc error: code = NotFound desc = could not find container \"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143\": container with ID starting with 58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143 not found: ID does not exist"
Apr 16 10:08:55.837106 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.837105 2525 scope.go:117] "RemoveContainer" containerID="a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097"
Apr 16 10:08:55.837287 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.837269 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097"} err="failed to get container status \"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097\": rpc error: code = NotFound desc = could not find container \"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097\": container with ID starting with a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097 not found: ID does not exist"
Apr 16 10:08:55.837328 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.837287 2525 scope.go:117] "RemoveContainer" containerID="ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464"
Apr 16 10:08:55.837462 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.837447 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464"} err="failed to get container status \"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464\": rpc error: code = NotFound desc = could not find container \"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464\": container with ID starting with ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464 not found: ID does not exist"
Apr 16 10:08:55.837519 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.837462 2525 scope.go:117] "RemoveContainer" containerID="316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668"
Apr 16 10:08:55.837653 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.837638 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668"} err="failed to get container status \"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668\": rpc error: code = NotFound desc = could not find container \"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668\": container with ID starting with 316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668 not found: ID does not exist"
Apr 16 10:08:55.837699 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.837653 2525 scope.go:117] "RemoveContainer" containerID="cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd"
Apr 16 10:08:55.837834 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.837813 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd"} err="failed to get container status \"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd\": rpc error: code = NotFound desc = could not find container \"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd\": container with ID starting with cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd not found: ID does not exist"
Apr 16 10:08:55.837880 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.837834 2525 scope.go:117] "RemoveContainer" containerID="339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3"
Apr 16 10:08:55.838008 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.837993 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3"} err="failed to get container status \"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3\": rpc error: code = NotFound desc = could not find container \"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3\": container with ID starting with 339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3 not found: ID does not exist"
Apr 16 10:08:55.838055 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.838007 2525 scope.go:117] "RemoveContainer" containerID="26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44"
Apr 16 10:08:55.838246 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.838207 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44"} err="failed to get container status \"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44\": rpc error: code = NotFound desc = could not find container \"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44\": container with ID starting with 26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44 not found: ID does not exist"
Apr 16 10:08:55.838246 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.838236 2525 scope.go:117] "RemoveContainer" containerID="58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143"
Apr 16 10:08:55.838429 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.838414 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143"} err="failed to get container status \"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143\": rpc error: code = NotFound desc = could not find container \"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143\": container with ID starting with 58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143 not found: ID does not exist"
Apr 16 10:08:55.838495 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.838429 2525 scope.go:117] "RemoveContainer" containerID="a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097"
Apr 16 10:08:55.838659 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.838642 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097"} err="failed to get container status \"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097\": rpc error: code = NotFound desc = could not find container \"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097\": container with ID starting with a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097 not found: ID does not exist"
Apr 16 10:08:55.838706 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.838660 2525 scope.go:117] "RemoveContainer" containerID="ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464"
Apr 16 10:08:55.838859 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.838844 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464"} err="failed to get container status \"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464\": rpc error: code = NotFound desc = could not find container \"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464\": container with ID starting with ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464 not found: ID does not exist"
Apr 16 10:08:55.838898 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.838859 2525 scope.go:117] "RemoveContainer" containerID="316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668"
Apr 16 10:08:55.839028 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.839015 2525 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668"} err="failed to get container status \"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668\": rpc error: code = NotFound desc = could not find container \"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668\": container with ID starting with 316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668 not found: ID does not exist" Apr 16 10:08:55.839072 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.839028 2525 scope.go:117] "RemoveContainer" containerID="cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd" Apr 16 10:08:55.839229 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.839214 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd"} err="failed to get container status \"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd\": rpc error: code = NotFound desc = could not find container \"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd\": container with ID starting with cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd not found: ID does not exist" Apr 16 10:08:55.839269 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.839230 2525 scope.go:117] "RemoveContainer" containerID="339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3" Apr 16 10:08:55.839416 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.839396 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3"} err="failed to get container status \"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3\": rpc error: code = NotFound desc = could not find container \"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3\": container with ID starting with 
339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3 not found: ID does not exist" Apr 16 10:08:55.839459 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.839419 2525 scope.go:117] "RemoveContainer" containerID="26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44" Apr 16 10:08:55.839621 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.839602 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44"} err="failed to get container status \"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44\": rpc error: code = NotFound desc = could not find container \"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44\": container with ID starting with 26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44 not found: ID does not exist" Apr 16 10:08:55.839671 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.839621 2525 scope.go:117] "RemoveContainer" containerID="58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143" Apr 16 10:08:55.839782 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.839769 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143"} err="failed to get container status \"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143\": rpc error: code = NotFound desc = could not find container \"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143\": container with ID starting with 58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143 not found: ID does not exist" Apr 16 10:08:55.839819 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.839782 2525 scope.go:117] "RemoveContainer" containerID="a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097" Apr 16 10:08:55.839945 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.839928 2525 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097"} err="failed to get container status \"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097\": rpc error: code = NotFound desc = could not find container \"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097\": container with ID starting with a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097 not found: ID does not exist" Apr 16 10:08:55.840005 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.839945 2525 scope.go:117] "RemoveContainer" containerID="ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464" Apr 16 10:08:55.840185 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.840166 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464"} err="failed to get container status \"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464\": rpc error: code = NotFound desc = could not find container \"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464\": container with ID starting with ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464 not found: ID does not exist" Apr 16 10:08:55.840233 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.840185 2525 scope.go:117] "RemoveContainer" containerID="316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668" Apr 16 10:08:55.840407 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.840388 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668"} err="failed to get container status \"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668\": rpc error: code = NotFound desc = could not find container 
\"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668\": container with ID starting with 316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668 not found: ID does not exist" Apr 16 10:08:55.840475 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.840409 2525 scope.go:117] "RemoveContainer" containerID="cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd" Apr 16 10:08:55.840662 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.840645 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd"} err="failed to get container status \"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd\": rpc error: code = NotFound desc = could not find container \"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd\": container with ID starting with cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd not found: ID does not exist" Apr 16 10:08:55.840713 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.840662 2525 scope.go:117] "RemoveContainer" containerID="339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3" Apr 16 10:08:55.840877 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.840860 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3"} err="failed to get container status \"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3\": rpc error: code = NotFound desc = could not find container \"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3\": container with ID starting with 339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3 not found: ID does not exist" Apr 16 10:08:55.840920 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.840879 2525 scope.go:117] "RemoveContainer" 
containerID="26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44" Apr 16 10:08:55.841054 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.841037 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44"} err="failed to get container status \"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44\": rpc error: code = NotFound desc = could not find container \"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44\": container with ID starting with 26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44 not found: ID does not exist" Apr 16 10:08:55.841118 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.841056 2525 scope.go:117] "RemoveContainer" containerID="58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143" Apr 16 10:08:55.841239 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.841223 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143"} err="failed to get container status \"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143\": rpc error: code = NotFound desc = could not find container \"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143\": container with ID starting with 58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143 not found: ID does not exist" Apr 16 10:08:55.841286 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.841238 2525 scope.go:117] "RemoveContainer" containerID="a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097" Apr 16 10:08:55.841449 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.841429 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097"} err="failed to get container status 
\"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097\": rpc error: code = NotFound desc = could not find container \"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097\": container with ID starting with a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097 not found: ID does not exist" Apr 16 10:08:55.841500 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.841449 2525 scope.go:117] "RemoveContainer" containerID="ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464" Apr 16 10:08:55.841719 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.841702 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464"} err="failed to get container status \"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464\": rpc error: code = NotFound desc = could not find container \"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464\": container with ID starting with ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464 not found: ID does not exist" Apr 16 10:08:55.841770 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.841719 2525 scope.go:117] "RemoveContainer" containerID="316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668" Apr 16 10:08:55.841934 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.841916 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668"} err="failed to get container status \"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668\": rpc error: code = NotFound desc = could not find container \"316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668\": container with ID starting with 316d0f27d94ceec4880669d2b7ca393a427ef28be1424c9e648121f9ddbd0668 not found: ID does not exist" Apr 16 10:08:55.842025 ip-10-0-135-1 kubenswrapper[2525]: 
I0416 10:08:55.841938 2525 scope.go:117] "RemoveContainer" containerID="cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd" Apr 16 10:08:55.842131 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.842114 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd"} err="failed to get container status \"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd\": rpc error: code = NotFound desc = could not find container \"cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd\": container with ID starting with cb5a56fd45e97188495ba37bb1513cb79b081f29d7cec3b8fc46ad74c34d88fd not found: ID does not exist" Apr 16 10:08:55.842183 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.842132 2525 scope.go:117] "RemoveContainer" containerID="339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3" Apr 16 10:08:55.842353 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.842336 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3"} err="failed to get container status \"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3\": rpc error: code = NotFound desc = could not find container \"339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3\": container with ID starting with 339b3400c0bb06fc0f2a8a62789f3be51a870d030411e418ce53555355b625f3 not found: ID does not exist" Apr 16 10:08:55.842395 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.842354 2525 scope.go:117] "RemoveContainer" containerID="26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44" Apr 16 10:08:55.842554 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.842537 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44"} 
err="failed to get container status \"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44\": rpc error: code = NotFound desc = could not find container \"26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44\": container with ID starting with 26972e6d6d9db1ac164809b0c8939963009a24fcefe32ce6795b533d6adeba44 not found: ID does not exist" Apr 16 10:08:55.842599 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.842555 2525 scope.go:117] "RemoveContainer" containerID="58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143" Apr 16 10:08:55.842769 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.842752 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143"} err="failed to get container status \"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143\": rpc error: code = NotFound desc = could not find container \"58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143\": container with ID starting with 58744e35cbc812c58be77dbb7900fc9a05be23296696c55acf0bf042c64b3143 not found: ID does not exist" Apr 16 10:08:55.842811 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.842769 2525 scope.go:117] "RemoveContainer" containerID="a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097" Apr 16 10:08:55.842972 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.842957 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097"} err="failed to get container status \"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097\": rpc error: code = NotFound desc = could not find container \"a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097\": container with ID starting with a8cc558e88fb2c9e121de9ee564ebba42858615191103129ec6019daac751097 not found: ID does not exist" Apr 16 10:08:55.842972 
ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.842971 2525 scope.go:117] "RemoveContainer" containerID="ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464" Apr 16 10:08:55.843153 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.843136 2525 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464"} err="failed to get container status \"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464\": rpc error: code = NotFound desc = could not find container \"ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464\": container with ID starting with ada6fee44efee4fc6e71f1c6b196591d2b07b6d88910653f6c0d463c4f9db464 not found: ID does not exist" Apr 16 10:08:55.846964 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.846943 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 10:08:55.847213 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847202 2525 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="init-config-reloader" Apr 16 10:08:55.847254 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847215 2525 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="init-config-reloader" Apr 16 10:08:55.847254 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847229 2525 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="prometheus" Apr 16 10:08:55.847254 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847235 2525 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="prometheus" Apr 16 10:08:55.847254 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847241 2525 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="kube-rbac-proxy-thanos" Apr 16 10:08:55.847254 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847247 2525 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="kube-rbac-proxy-thanos" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847258 2525 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a082bd3-2d68-4bba-a387-faa89f319dac" containerName="registry" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847264 2525 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a082bd3-2d68-4bba-a387-faa89f319dac" containerName="registry" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847272 2525 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="kube-rbac-proxy" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847277 2525 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="kube-rbac-proxy" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847284 2525 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="thanos-sidecar" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847289 2525 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="thanos-sidecar" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847298 2525 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="kube-rbac-proxy-web" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847303 2525 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="kube-rbac-proxy-web" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847310 2525 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="config-reloader" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847315 2525 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="config-reloader" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847354 2525 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="thanos-sidecar" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847363 2525 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="kube-rbac-proxy-web" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847369 2525 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="prometheus" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847375 2525 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="kube-rbac-proxy-thanos" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847383 2525 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="config-reloader" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847389 2525 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a082bd3-2d68-4bba-a387-faa89f319dac" containerName="registry" Apr 16 10:08:55.847392 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.847396 2525 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" containerName="kube-rbac-proxy" Apr 16 10:08:55.852720 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.852703 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.854958 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.854941 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 10:08:55.855075 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.855049 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 10:08:55.855989 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.855976 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 10:08:55.856932 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.856918 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 10:08:55.858790 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.858770 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 10:08:55.858903 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.858807 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 10:08:55.858903 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.858814 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3m7ce9s2mumde\"" Apr 16 10:08:55.859173 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.859155 2525 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-n8w89\"" Apr 16 10:08:55.859238 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.859214 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 10:08:55.859538 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.859389 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 10:08:55.859689 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.859664 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 10:08:55.859888 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.859870 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 10:08:55.863577 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.863104 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 10:08:55.867747 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.867573 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 10:08:55.868779 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.868760 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 10:08:55.917578 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.917547 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75e4c1e8-fa1c-41fb-80d5-35856f420384-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
10:08:55.917765 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.917582 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.917765 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.917606 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-config\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.917765 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.917657 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.917765 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.917695 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.917765 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.917729 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/75e4c1e8-fa1c-41fb-80d5-35856f420384-config-out\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.917765 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.917753 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.918048 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.917774 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-web-config\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.918048 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.917801 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75e4c1e8-fa1c-41fb-80d5-35856f420384-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.918048 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.917834 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75e4c1e8-fa1c-41fb-80d5-35856f420384-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.918048 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.917857 2525 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75e4c1e8-fa1c-41fb-80d5-35856f420384-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.918048 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.917887 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.918048 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.917913 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.918048 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.917950 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.918048 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.918003 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75e4c1e8-fa1c-41fb-80d5-35856f420384-configmap-metrics-client-ca\") pod 
\"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.918048 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.918029 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gmf5\" (UniqueName: \"kubernetes.io/projected/75e4c1e8-fa1c-41fb-80d5-35856f420384-kube-api-access-2gmf5\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.918314 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.918067 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75e4c1e8-fa1c-41fb-80d5-35856f420384-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.918314 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.918088 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75e4c1e8-fa1c-41fb-80d5-35856f420384-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:55.993239 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:55.993194 2525 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a18c918-39f6-4be5-b45e-3afec94f3a51" path="/var/lib/kubelet/pods/0a18c918-39f6-4be5-b45e-3afec94f3a51/volumes" Apr 16 10:08:56.018894 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.018863 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75e4c1e8-fa1c-41fb-80d5-35856f420384-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.018894 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.018908 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019143 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.018945 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-config\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019143 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.018978 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019143 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.019000 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019284 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.019159 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/75e4c1e8-fa1c-41fb-80d5-35856f420384-config-out\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019284 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.019203 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019284 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.019230 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-web-config\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019284 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.019259 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75e4c1e8-fa1c-41fb-80d5-35856f420384-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019907 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.019297 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75e4c1e8-fa1c-41fb-80d5-35856f420384-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019907 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.019308 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75e4c1e8-fa1c-41fb-80d5-35856f420384-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019907 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.019325 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75e4c1e8-fa1c-41fb-80d5-35856f420384-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019907 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.019358 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019907 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.019380 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019907 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.019424 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019907 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.019464 2525 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75e4c1e8-fa1c-41fb-80d5-35856f420384-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019907 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.019488 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gmf5\" (UniqueName: \"kubernetes.io/projected/75e4c1e8-fa1c-41fb-80d5-35856f420384-kube-api-access-2gmf5\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019907 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.019552 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75e4c1e8-fa1c-41fb-80d5-35856f420384-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.019907 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.019588 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75e4c1e8-fa1c-41fb-80d5-35856f420384-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.020549 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.020520 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75e4c1e8-fa1c-41fb-80d5-35856f420384-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.020636 
ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.020604 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75e4c1e8-fa1c-41fb-80d5-35856f420384-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.021298 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.021265 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75e4c1e8-fa1c-41fb-80d5-35856f420384-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.022211 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.022187 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75e4c1e8-fa1c-41fb-80d5-35856f420384-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.022350 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.022330 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.022548 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.022527 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75e4c1e8-fa1c-41fb-80d5-35856f420384-config-out\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.023627 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.023595 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.023627 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.023611 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.023847 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.023829 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.024011 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.023988 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75e4c1e8-fa1c-41fb-80d5-35856f420384-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.024234 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.024207 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.024324 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.024220 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.024410 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.024393 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75e4c1e8-fa1c-41fb-80d5-35856f420384-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.024453 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.024394 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-config\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.024704 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.024683 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-web-config\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.025398 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.025382 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/75e4c1e8-fa1c-41fb-80d5-35856f420384-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.029005 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.028984 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gmf5\" (UniqueName: \"kubernetes.io/projected/75e4c1e8-fa1c-41fb-80d5-35856f420384-kube-api-access-2gmf5\") pod \"prometheus-k8s-0\" (UID: \"75e4c1e8-fa1c-41fb-80d5-35856f420384\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.167049 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.167011 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:08:56.299417 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.299393 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 10:08:56.302246 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:08:56.302212 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75e4c1e8_fa1c_41fb_80d5_35856f420384.slice/crio-c1ab67d4b492b66c152e666ba64bc936d26c9d5cbc3f7ef410b05ddf5306e852 WatchSource:0}: Error finding container c1ab67d4b492b66c152e666ba64bc936d26c9d5cbc3f7ef410b05ddf5306e852: Status 404 returned error can't find the container with id c1ab67d4b492b66c152e666ba64bc936d26c9d5cbc3f7ef410b05ddf5306e852 Apr 16 10:08:56.793701 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.793612 2525 generic.go:358] "Generic (PLEG): container finished" podID="75e4c1e8-fa1c-41fb-80d5-35856f420384" containerID="53cef1d3476e6b00d64539e7d452a7dafe78e53cb4e25184d8b7e87940777d89" exitCode=0 Apr 16 10:08:56.793841 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.793696 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75e4c1e8-fa1c-41fb-80d5-35856f420384","Type":"ContainerDied","Data":"53cef1d3476e6b00d64539e7d452a7dafe78e53cb4e25184d8b7e87940777d89"} Apr 16 10:08:56.793841 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:56.793730 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75e4c1e8-fa1c-41fb-80d5-35856f420384","Type":"ContainerStarted","Data":"c1ab67d4b492b66c152e666ba64bc936d26c9d5cbc3f7ef410b05ddf5306e852"} Apr 16 10:08:57.800825 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:57.800794 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75e4c1e8-fa1c-41fb-80d5-35856f420384","Type":"ContainerStarted","Data":"cdc521f32aeeba9542f6a805deeadfb5509623c02f7bfd59e8d5f457a8841bc1"} Apr 16 10:08:57.800825 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:57.800828 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75e4c1e8-fa1c-41fb-80d5-35856f420384","Type":"ContainerStarted","Data":"2b3a7375d277de437bb36d479e7ccbd4418795813450ef507ad36c82b83a1df3"} Apr 16 10:08:57.801197 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:57.800838 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75e4c1e8-fa1c-41fb-80d5-35856f420384","Type":"ContainerStarted","Data":"87d1ea5ba72d0745608e750b9847857e00f1f8c269a0bb37e25d613124afe7ea"} Apr 16 10:08:57.801197 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:57.800846 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75e4c1e8-fa1c-41fb-80d5-35856f420384","Type":"ContainerStarted","Data":"77fa67f33e75ea29c8829e236a0a33889713ce2d5da0c78b87265f41d0d1d660"} Apr 16 10:08:57.801197 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:57.800854 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75e4c1e8-fa1c-41fb-80d5-35856f420384","Type":"ContainerStarted","Data":"cba9e4bdfcafd47015632f38b3bfe5b72ab1c40677d0b109b6c583d3c9a0c880"} Apr 16 10:08:57.801197 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:57.800862 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75e4c1e8-fa1c-41fb-80d5-35856f420384","Type":"ContainerStarted","Data":"98a6a8d8c85fcc65446b7d7e918411cd1b66c88917821b2192566f597a9715f1"} Apr 16 10:08:57.834044 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:08:57.833989 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.833962921 podStartE2EDuration="2.833962921s" podCreationTimestamp="2026-04-16 10:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:08:57.832310716 +0000 UTC m=+260.431758671" watchObservedRunningTime="2026-04-16 10:08:57.833962921 +0000 UTC m=+260.433410878" Apr 16 10:09:01.168060 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:09:01.168026 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:09:37.866456 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:09:37.866428 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log" Apr 16 10:09:37.866968 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:09:37.866505 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log" Apr 16 10:09:37.876310 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:09:37.876290 2525 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 
10:09:56.167555 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:09:56.167497 2525 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:09:56.182748 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:09:56.182709 2525 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:09:56.987237 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:09:56.987213 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 10:10:23.847927 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:23.847892 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-l482m"] Apr 16 10:10:23.851374 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:23.851357 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-l482m" Apr 16 10:10:23.853862 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:23.853837 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 10:10:23.854692 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:23.854670 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 10:10:23.854793 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:23.854707 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-khnvb\"" Apr 16 10:10:23.861163 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:23.861144 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-l482m"] Apr 16 10:10:23.948851 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:23.948817 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zdfbb\" (UniqueName: \"kubernetes.io/projected/ae086d08-7560-49a5-92bd-8e4184cadf8e-kube-api-access-zdfbb\") pod \"cert-manager-cainjector-8966b78d4-l482m\" (UID: \"ae086d08-7560-49a5-92bd-8e4184cadf8e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-l482m" Apr 16 10:10:23.948989 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:23.948882 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ae086d08-7560-49a5-92bd-8e4184cadf8e-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-l482m\" (UID: \"ae086d08-7560-49a5-92bd-8e4184cadf8e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-l482m" Apr 16 10:10:24.049893 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:24.049858 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdfbb\" (UniqueName: \"kubernetes.io/projected/ae086d08-7560-49a5-92bd-8e4184cadf8e-kube-api-access-zdfbb\") pod \"cert-manager-cainjector-8966b78d4-l482m\" (UID: \"ae086d08-7560-49a5-92bd-8e4184cadf8e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-l482m" Apr 16 10:10:24.050067 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:24.049932 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ae086d08-7560-49a5-92bd-8e4184cadf8e-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-l482m\" (UID: \"ae086d08-7560-49a5-92bd-8e4184cadf8e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-l482m" Apr 16 10:10:24.058971 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:24.058947 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ae086d08-7560-49a5-92bd-8e4184cadf8e-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-l482m\" (UID: \"ae086d08-7560-49a5-92bd-8e4184cadf8e\") " 
pod="cert-manager/cert-manager-cainjector-8966b78d4-l482m" Apr 16 10:10:24.059109 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:24.059087 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdfbb\" (UniqueName: \"kubernetes.io/projected/ae086d08-7560-49a5-92bd-8e4184cadf8e-kube-api-access-zdfbb\") pod \"cert-manager-cainjector-8966b78d4-l482m\" (UID: \"ae086d08-7560-49a5-92bd-8e4184cadf8e\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-l482m" Apr 16 10:10:24.172945 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:24.172860 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-l482m" Apr 16 10:10:24.290482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:24.290452 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-l482m"] Apr 16 10:10:24.293595 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:10:24.293565 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae086d08_7560_49a5_92bd_8e4184cadf8e.slice/crio-b4db0417e15dc2ffc2babc82472895942385f7987d4c37fa491ea4596580bb5a WatchSource:0}: Error finding container b4db0417e15dc2ffc2babc82472895942385f7987d4c37fa491ea4596580bb5a: Status 404 returned error can't find the container with id b4db0417e15dc2ffc2babc82472895942385f7987d4c37fa491ea4596580bb5a Apr 16 10:10:24.295710 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:24.295690 2525 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 10:10:25.053956 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:25.053925 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-l482m" event={"ID":"ae086d08-7560-49a5-92bd-8e4184cadf8e","Type":"ContainerStarted","Data":"b4db0417e15dc2ffc2babc82472895942385f7987d4c37fa491ea4596580bb5a"} Apr 16 
10:10:28.067714 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:28.067670 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-l482m" event={"ID":"ae086d08-7560-49a5-92bd-8e4184cadf8e","Type":"ContainerStarted","Data":"c8754110fd6e62d63c84c11715d9d75b01c14a06effdaee376919cbeab022c21"}
Apr 16 10:10:28.083079 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:10:28.083029 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-l482m" podStartSLOduration=1.449512071 podStartE2EDuration="5.083013102s" podCreationTimestamp="2026-04-16 10:10:23 +0000 UTC" firstStartedPulling="2026-04-16 10:10:24.295851249 +0000 UTC m=+346.895299183" lastFinishedPulling="2026-04-16 10:10:27.929352275 +0000 UTC m=+350.528800214" observedRunningTime="2026-04-16 10:10:28.08114784 +0000 UTC m=+350.680595796" watchObservedRunningTime="2026-04-16 10:10:28.083013102 +0000 UTC m=+350.682461058"
Apr 16 10:11:08.358609 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.358577 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"]
Apr 16 10:11:08.365873 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.365853 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:11:08.369310 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.368353 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\""
Apr 16 10:11:08.369310 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.368602 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\""
Apr 16 10:11:08.369310 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.369053 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\""
Apr 16 10:11:08.369310 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.369242 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\""
Apr 16 10:11:08.369644 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.369536 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\""
Apr 16 10:11:08.369644 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.369552 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-prh82\""
Apr 16 10:11:08.370767 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.370742 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"]
Apr 16 10:11:08.523584 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.523553 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15c47f47-5943-4380-83d5-1408e140b25b-metrics-certs\") pod \"jobset-controller-manager-5778cd48b5-2x7mz\" (UID: \"15c47f47-5943-4380-83d5-1408e140b25b\") " pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:11:08.523584 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.523592 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15c47f47-5943-4380-83d5-1408e140b25b-cert\") pod \"jobset-controller-manager-5778cd48b5-2x7mz\" (UID: \"15c47f47-5943-4380-83d5-1408e140b25b\") " pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:11:08.523806 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.523615 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/15c47f47-5943-4380-83d5-1408e140b25b-manager-config\") pod \"jobset-controller-manager-5778cd48b5-2x7mz\" (UID: \"15c47f47-5943-4380-83d5-1408e140b25b\") " pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:11:08.523806 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.523670 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgwdh\" (UniqueName: \"kubernetes.io/projected/15c47f47-5943-4380-83d5-1408e140b25b-kube-api-access-hgwdh\") pod \"jobset-controller-manager-5778cd48b5-2x7mz\" (UID: \"15c47f47-5943-4380-83d5-1408e140b25b\") " pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:11:08.624594 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.624477 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15c47f47-5943-4380-83d5-1408e140b25b-metrics-certs\") pod \"jobset-controller-manager-5778cd48b5-2x7mz\" (UID: \"15c47f47-5943-4380-83d5-1408e140b25b\") " pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:11:08.624594 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.624541 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15c47f47-5943-4380-83d5-1408e140b25b-cert\") pod \"jobset-controller-manager-5778cd48b5-2x7mz\" (UID: \"15c47f47-5943-4380-83d5-1408e140b25b\") " pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:11:08.624814 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.624656 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/15c47f47-5943-4380-83d5-1408e140b25b-manager-config\") pod \"jobset-controller-manager-5778cd48b5-2x7mz\" (UID: \"15c47f47-5943-4380-83d5-1408e140b25b\") " pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:11:08.624814 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.624695 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgwdh\" (UniqueName: \"kubernetes.io/projected/15c47f47-5943-4380-83d5-1408e140b25b-kube-api-access-hgwdh\") pod \"jobset-controller-manager-5778cd48b5-2x7mz\" (UID: \"15c47f47-5943-4380-83d5-1408e140b25b\") " pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:11:08.625235 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.625210 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/15c47f47-5943-4380-83d5-1408e140b25b-manager-config\") pod \"jobset-controller-manager-5778cd48b5-2x7mz\" (UID: \"15c47f47-5943-4380-83d5-1408e140b25b\") " pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:11:08.627086 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.627064 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15c47f47-5943-4380-83d5-1408e140b25b-cert\") pod \"jobset-controller-manager-5778cd48b5-2x7mz\" (UID: \"15c47f47-5943-4380-83d5-1408e140b25b\") " pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:11:08.627197 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.627116 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15c47f47-5943-4380-83d5-1408e140b25b-metrics-certs\") pod \"jobset-controller-manager-5778cd48b5-2x7mz\" (UID: \"15c47f47-5943-4380-83d5-1408e140b25b\") " pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:11:08.631918 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.631893 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgwdh\" (UniqueName: \"kubernetes.io/projected/15c47f47-5943-4380-83d5-1408e140b25b-kube-api-access-hgwdh\") pod \"jobset-controller-manager-5778cd48b5-2x7mz\" (UID: \"15c47f47-5943-4380-83d5-1408e140b25b\") " pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:11:08.677987 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.677955 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:11:08.800012 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:08.799928 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"]
Apr 16 10:11:08.803332 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:11:08.803304 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15c47f47_5943_4380_83d5_1408e140b25b.slice/crio-cca944a2cfda202b9cd51c514979e595818ab0630fa15db647c342912710c5fc WatchSource:0}: Error finding container cca944a2cfda202b9cd51c514979e595818ab0630fa15db647c342912710c5fc: Status 404 returned error can't find the container with id cca944a2cfda202b9cd51c514979e595818ab0630fa15db647c342912710c5fc
Apr 16 10:11:09.186641 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:09.186606 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz" event={"ID":"15c47f47-5943-4380-83d5-1408e140b25b","Type":"ContainerStarted","Data":"cca944a2cfda202b9cd51c514979e595818ab0630fa15db647c342912710c5fc"}
Apr 16 10:11:12.197471 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:12.197439 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz" event={"ID":"15c47f47-5943-4380-83d5-1408e140b25b","Type":"ContainerStarted","Data":"3f92d379726acbf64328c02d825ecaeeb7a866e6891c88b6c63bbe31d5160b3f"}
Apr 16 10:11:12.197862 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:12.197536 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:11:12.213712 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:12.213664 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz" podStartSLOduration=1.585216132 podStartE2EDuration="4.213650093s" podCreationTimestamp="2026-04-16 10:11:08 +0000 UTC" firstStartedPulling="2026-04-16 10:11:08.805065058 +0000 UTC m=+391.404512995" lastFinishedPulling="2026-04-16 10:11:11.433499012 +0000 UTC m=+394.032946956" observedRunningTime="2026-04-16 10:11:12.211337337 +0000 UTC m=+394.810785293" watchObservedRunningTime="2026-04-16 10:11:12.213650093 +0000 UTC m=+394.813098045"
Apr 16 10:11:23.205913 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:11:23.205880 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-5778cd48b5-2x7mz"
Apr 16 10:14:37.889855 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:14:37.889830 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:14:37.892211 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:14:37.892186 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:19:37.910654 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:19:37.910561 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:19:37.912658 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:19:37.912635 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:24:37.932337 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:24:37.932312 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:24:37.934808 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:24:37.934282 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:29:37.952013 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:29:37.951899 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:29:37.955805 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:29:37.955787 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:34:37.972893 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:34:37.972793 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:34:37.977928 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:34:37.977906 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:39:37.995065 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:39:37.994968 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:39:37.999489 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:39:37.999469 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:44:38.013654 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:44:38.013552 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:44:38.019222 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:44:38.019195 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:49:38.033535 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:49:38.033418 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:49:38.041355 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:49:38.041334 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:50:22.774817 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:22.774784 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"]
Apr 16 10:50:22.777866 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:22.777850 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"
Apr 16 10:50:22.780746 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:22.780721 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-28lhx\"/\"openshift-service-ca.crt\""
Apr 16 10:50:22.780870 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:22.780832 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-28lhx\"/\"kube-root-ca.crt\""
Apr 16 10:50:22.781634 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:22.781617 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-28lhx\"/\"default-dockercfg-wbmph\""
Apr 16 10:50:22.812284 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:22.812266 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb9bp\" (UniqueName: \"kubernetes.io/projected/d2b0601f-371c-4e9f-96d5-22c1703882ac-kube-api-access-bb9bp\") pod \"test-trainjob-4xc4f-dataset-initializer-0-0-4mckq\" (UID: \"d2b0601f-371c-4e9f-96d5-22c1703882ac\") " pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"
Apr 16 10:50:22.832704 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:22.832681 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"]
Apr 16 10:50:22.916882 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:22.916852 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bb9bp\" (UniqueName: \"kubernetes.io/projected/d2b0601f-371c-4e9f-96d5-22c1703882ac-kube-api-access-bb9bp\") pod \"test-trainjob-4xc4f-dataset-initializer-0-0-4mckq\" (UID: \"d2b0601f-371c-4e9f-96d5-22c1703882ac\") " pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"
Apr 16 10:50:22.927699 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:22.927666 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb9bp\" (UniqueName: \"kubernetes.io/projected/d2b0601f-371c-4e9f-96d5-22c1703882ac-kube-api-access-bb9bp\") pod \"test-trainjob-4xc4f-dataset-initializer-0-0-4mckq\" (UID: \"d2b0601f-371c-4e9f-96d5-22c1703882ac\") " pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"
Apr 16 10:50:32.390296 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:32.390257 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-dataset-initializer-0-0-4mckq\" (UID: \"d2b0601f-371c-4e9f-96d5-22c1703882ac\") " pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"
Apr 16 10:50:32.397089 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:32.397063 2525 operation_generator.go:1469] "Controller attach succeeded for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-dataset-initializer-0-0-4mckq\" (UID: \"d2b0601f-371c-4e9f-96d5-22c1703882ac\") device path: \"\"" pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"
Apr 16 10:50:32.491085 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:32.491044 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-dataset-initializer-0-0-4mckq\" (UID: \"d2b0601f-371c-4e9f-96d5-22c1703882ac\") " pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"
Apr 16 10:50:32.491266 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:32.491194 2525 operation_generator.go:515] "MountVolume.WaitForAttach entering for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-dataset-initializer-0-0-4mckq\" (UID: \"d2b0601f-371c-4e9f-96d5-22c1703882ac\") DevicePath \"\"" pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"
Apr 16 10:50:32.494425 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:32.494403 2525 operation_generator.go:525] "MountVolume.WaitForAttach succeeded for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-dataset-initializer-0-0-4mckq\" (UID: \"d2b0601f-371c-4e9f-96d5-22c1703882ac\") DevicePath \"csi-cc63bc0088325ab8734a037471910399bf37dc254953f225966535691a1c0aa6\"" pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"
Apr 16 10:50:32.679985 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:32.679942 2525 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-dataset-initializer-0-0-4mckq\" (UID: \"d2b0601f-371c-4e9f-96d5-22c1703882ac\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/ebs.csi.aws.com/8937ea088fea58cb063953fb1ba1725a076f210879b1866822286697fdc5e4b1/globalmount\"" pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"
Apr 16 10:50:32.694973 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:32.694943 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-dataset-initializer-0-0-4mckq\" (UID: \"d2b0601f-371c-4e9f-96d5-22c1703882ac\") " pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"
Apr 16 10:50:32.988677 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:32.988583 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"
Apr 16 10:50:33.106994 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:33.106974 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"]
Apr 16 10:50:33.109363 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:50:33.109336 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2b0601f_371c_4e9f_96d5_22c1703882ac.slice/crio-28e9634f85221a14e565495d02d6945883829c5d8ac22b3bb8ecf118b4fe8483 WatchSource:0}: Error finding container 28e9634f85221a14e565495d02d6945883829c5d8ac22b3bb8ecf118b4fe8483: Status 404 returned error can't find the container with id 28e9634f85221a14e565495d02d6945883829c5d8ac22b3bb8ecf118b4fe8483
Apr 16 10:50:33.111598 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:33.111581 2525 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 10:50:33.955062 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:33.955007 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq" event={"ID":"d2b0601f-371c-4e9f-96d5-22c1703882ac","Type":"ContainerStarted","Data":"28e9634f85221a14e565495d02d6945883829c5d8ac22b3bb8ecf118b4fe8483"}
Apr 16 10:50:37.967459 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:37.967420 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq" event={"ID":"d2b0601f-371c-4e9f-96d5-22c1703882ac","Type":"ContainerStarted","Data":"94216366f9a0c83447e2d5f09cf518009e9732594d4f76f11e1f4e647232bd09"}
Apr 16 10:50:42.982652 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:42.982598 2525 generic.go:358] "Generic (PLEG): container finished" podID="d2b0601f-371c-4e9f-96d5-22c1703882ac" containerID="94216366f9a0c83447e2d5f09cf518009e9732594d4f76f11e1f4e647232bd09" exitCode=0
Apr 16 10:50:42.983017 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:42.982675 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq" event={"ID":"d2b0601f-371c-4e9f-96d5-22c1703882ac","Type":"ContainerDied","Data":"94216366f9a0c83447e2d5f09cf518009e9732594d4f76f11e1f4e647232bd09"}
Apr 16 10:50:44.109489 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:44.109466 2525 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"
Apr 16 10:50:44.196684 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:44.196658 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb9bp\" (UniqueName: \"kubernetes.io/projected/d2b0601f-371c-4e9f-96d5-22c1703882ac-kube-api-access-bb9bp\") pod \"d2b0601f-371c-4e9f-96d5-22c1703882ac\" (UID: \"d2b0601f-371c-4e9f-96d5-22c1703882ac\") "
Apr 16 10:50:44.196859 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:44.196773 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workspace\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"d2b0601f-371c-4e9f-96d5-22c1703882ac\" (UID: \"d2b0601f-371c-4e9f-96d5-22c1703882ac\") "
Apr 16 10:50:44.198874 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:44.198848 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2b0601f-371c-4e9f-96d5-22c1703882ac-kube-api-access-bb9bp" (OuterVolumeSpecName: "kube-api-access-bb9bp") pod "d2b0601f-371c-4e9f-96d5-22c1703882ac" (UID: "d2b0601f-371c-4e9f-96d5-22c1703882ac"). InnerVolumeSpecName "kube-api-access-bb9bp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 10:50:44.199369 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:44.199345 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60" (OuterVolumeSpecName: "workspace") pod "d2b0601f-371c-4e9f-96d5-22c1703882ac" (UID: "d2b0601f-371c-4e9f-96d5-22c1703882ac"). InnerVolumeSpecName "pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306". PluginName "kubernetes.io/csi", VolumeGIDValue ""
Apr 16 10:50:44.297853 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:44.297757 2525 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bb9bp\" (UniqueName: \"kubernetes.io/projected/d2b0601f-371c-4e9f-96d5-22c1703882ac-kube-api-access-bb9bp\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\""
Apr 16 10:50:44.297853 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:44.297824 2525 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") on node \"ip-10-0-135-1.ec2.internal\" "
Apr 16 10:50:44.319259 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:44.319240 2525 operation_generator.go:895] UnmountDevice succeeded for volume "pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306" (UniqueName: "kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60") on node "ip-10-0-135-1.ec2.internal"
Apr 16 10:50:44.399144 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:44.399112 2525 reconciler_common.go:299] "Volume detached for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"csi-cc63bc0088325ab8734a037471910399bf37dc254953f225966535691a1c0aa6\""
Apr 16 10:50:44.989801 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:44.989764 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq" event={"ID":"d2b0601f-371c-4e9f-96d5-22c1703882ac","Type":"ContainerDied","Data":"28e9634f85221a14e565495d02d6945883829c5d8ac22b3bb8ecf118b4fe8483"}
Apr 16 10:50:44.989801 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:44.989781 2525 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"
Apr 16 10:50:44.989801 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:44.989794 2525 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28e9634f85221a14e565495d02d6945883829c5d8ac22b3bb8ecf118b4fe8483"
Apr 16 10:50:45.091873 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:45.091839 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"]
Apr 16 10:50:45.092146 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:45.092134 2525 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2b0601f-371c-4e9f-96d5-22c1703882ac" containerName="dataset-initializer"
Apr 16 10:50:45.092186 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:45.092148 2525 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b0601f-371c-4e9f-96d5-22c1703882ac" containerName="dataset-initializer"
Apr 16 10:50:45.092218 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:45.092211 2525 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2b0601f-371c-4e9f-96d5-22c1703882ac" containerName="dataset-initializer"
Apr 16 10:50:45.095319 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:45.095299 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"
Apr 16 10:50:45.097854 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:45.097832 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-28lhx\"/\"default-dockercfg-wbmph\""
Apr 16 10:50:45.098090 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:45.098076 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-28lhx\"/\"kube-root-ca.crt\""
Apr 16 10:50:45.098412 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:45.098399 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-28lhx\"/\"openshift-service-ca.crt\""
Apr 16 10:50:45.109725 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:45.109700 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"]
Apr 16 10:50:45.205364 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:45.205322 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr5z6\" (UniqueName: \"kubernetes.io/projected/70f1e922-1884-4d7e-98c8-3d4ea733133f-kube-api-access-dr5z6\") pod \"test-trainjob-4xc4f-model-initializer-0-0-n9tl6\" (UID: \"70f1e922-1884-4d7e-98c8-3d4ea733133f\") " pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"
Apr 16 10:50:45.306613 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:45.306571 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dr5z6\" (UniqueName: \"kubernetes.io/projected/70f1e922-1884-4d7e-98c8-3d4ea733133f-kube-api-access-dr5z6\") pod \"test-trainjob-4xc4f-model-initializer-0-0-n9tl6\" (UID: \"70f1e922-1884-4d7e-98c8-3d4ea733133f\") " pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"
Apr 16 10:50:45.315587 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:45.315564 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr5z6\" (UniqueName: \"kubernetes.io/projected/70f1e922-1884-4d7e-98c8-3d4ea733133f-kube-api-access-dr5z6\") pod \"test-trainjob-4xc4f-model-initializer-0-0-n9tl6\" (UID: \"70f1e922-1884-4d7e-98c8-3d4ea733133f\") " pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"
Apr 16 10:50:52.765260 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:52.765221 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-model-initializer-0-0-n9tl6\" (UID: \"70f1e922-1884-4d7e-98c8-3d4ea733133f\") " pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"
Apr 16 10:50:52.772315 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:52.772287 2525 operation_generator.go:1469] "Controller attach succeeded for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-model-initializer-0-0-n9tl6\" (UID: \"70f1e922-1884-4d7e-98c8-3d4ea733133f\") device path: \"\"" pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"
Apr 16 10:50:52.866611 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:52.866560 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-model-initializer-0-0-n9tl6\" (UID: \"70f1e922-1884-4d7e-98c8-3d4ea733133f\") " pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"
Apr 16 10:50:52.866881 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:52.866712 2525 operation_generator.go:515] "MountVolume.WaitForAttach entering for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-model-initializer-0-0-n9tl6\" (UID: \"70f1e922-1884-4d7e-98c8-3d4ea733133f\") DevicePath \"\"" pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"
Apr 16 10:50:52.869941 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:52.869923 2525 operation_generator.go:525] "MountVolume.WaitForAttach succeeded for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-model-initializer-0-0-n9tl6\" (UID: \"70f1e922-1884-4d7e-98c8-3d4ea733133f\") DevicePath \"csi-cc63bc0088325ab8734a037471910399bf37dc254953f225966535691a1c0aa6\"" pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"
Apr 16 10:50:52.957002 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:52.956950 2525 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-model-initializer-0-0-n9tl6\" (UID: \"70f1e922-1884-4d7e-98c8-3d4ea733133f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/ebs.csi.aws.com/8937ea088fea58cb063953fb1ba1725a076f210879b1866822286697fdc5e4b1/globalmount\"" pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"
Apr 16 10:50:52.972140 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:52.972104 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"test-trainjob-4xc4f-model-initializer-0-0-n9tl6\" (UID: \"70f1e922-1884-4d7e-98c8-3d4ea733133f\") " pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"
Apr 16 10:50:53.205621 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:53.205594 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"
Apr 16 10:50:53.325147 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:53.325125 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"]
Apr 16 10:50:53.327375 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:50:53.327346 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70f1e922_1884_4d7e_98c8_3d4ea733133f.slice/crio-918dce1cdef43f6e74bd89f365a4b3f4be9243c601d2251a1b82e0776d904f2f WatchSource:0}: Error finding container 918dce1cdef43f6e74bd89f365a4b3f4be9243c601d2251a1b82e0776d904f2f: Status 404 returned error can't find the container with id 918dce1cdef43f6e74bd89f365a4b3f4be9243c601d2251a1b82e0776d904f2f
Apr 16 10:50:54.013311 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:50:54.013271 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6" event={"ID":"70f1e922-1884-4d7e-98c8-3d4ea733133f","Type":"ContainerStarted","Data":"918dce1cdef43f6e74bd89f365a4b3f4be9243c601d2251a1b82e0776d904f2f"}
Apr 16 10:52:11.258575 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:11.258537 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6" event={"ID":"70f1e922-1884-4d7e-98c8-3d4ea733133f","Type":"ContainerStarted","Data":"35b35ffc59bd070016c33b1de2ae0e162387aaeb0d4e91ffc8f1257b7af95244"}
Apr 16 10:52:11.275912 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:11.275862 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6" podStartSLOduration=9.304868445 podStartE2EDuration="1m26.275849723s" podCreationTimestamp="2026-04-16 10:50:45 +0000 UTC" firstStartedPulling="2026-04-16 10:50:53.329294526 +0000 UTC m=+2775.928742460" lastFinishedPulling="2026-04-16 10:52:10.300275802 +0000 UTC m=+2852.899723738" observedRunningTime="2026-04-16 10:52:11.274652451 +0000 UTC m=+2853.874100407" watchObservedRunningTime="2026-04-16 10:52:11.275849723 +0000 UTC m=+2853.875297678"
Apr 16 10:52:22.292321 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:22.292286 2525 generic.go:358] "Generic (PLEG): container finished" podID="70f1e922-1884-4d7e-98c8-3d4ea733133f" containerID="35b35ffc59bd070016c33b1de2ae0e162387aaeb0d4e91ffc8f1257b7af95244" exitCode=0
Apr 16 10:52:22.292871 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:22.292345 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6" event={"ID":"70f1e922-1884-4d7e-98c8-3d4ea733133f","Type":"ContainerDied","Data":"35b35ffc59bd070016c33b1de2ae0e162387aaeb0d4e91ffc8f1257b7af95244"}
Apr 16 10:52:23.426847 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:23.426821 2525 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"
Apr 16 10:52:23.529315 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:23.529277 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr5z6\" (UniqueName: \"kubernetes.io/projected/70f1e922-1884-4d7e-98c8-3d4ea733133f-kube-api-access-dr5z6\") pod \"70f1e922-1884-4d7e-98c8-3d4ea733133f\" (UID: \"70f1e922-1884-4d7e-98c8-3d4ea733133f\") "
Apr 16 10:52:23.529484 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:23.529422 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workspace\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") pod \"70f1e922-1884-4d7e-98c8-3d4ea733133f\" (UID: \"70f1e922-1884-4d7e-98c8-3d4ea733133f\") "
Apr 16 10:52:23.531486 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:23.531449 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f1e922-1884-4d7e-98c8-3d4ea733133f-kube-api-access-dr5z6" (OuterVolumeSpecName: "kube-api-access-dr5z6") pod "70f1e922-1884-4d7e-98c8-3d4ea733133f" (UID: "70f1e922-1884-4d7e-98c8-3d4ea733133f"). InnerVolumeSpecName "kube-api-access-dr5z6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 10:52:23.532031 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:23.532011 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60" (OuterVolumeSpecName: "workspace") pod "70f1e922-1884-4d7e-98c8-3d4ea733133f" (UID: "70f1e922-1884-4d7e-98c8-3d4ea733133f"). InnerVolumeSpecName "pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306".
PluginName "kubernetes.io/csi", VolumeGIDValue "" Apr 16 10:52:23.630377 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:23.630326 2525 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") on node \"ip-10-0-135-1.ec2.internal\" " Apr 16 10:52:23.630377 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:23.630375 2525 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dr5z6\" (UniqueName: \"kubernetes.io/projected/70f1e922-1884-4d7e-98c8-3d4ea733133f-kube-api-access-dr5z6\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\"" Apr 16 10:52:23.691446 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:23.691419 2525 operation_generator.go:895] UnmountDevice succeeded for volume "pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306" (UniqueName: "kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60") on node "ip-10-0-135-1.ec2.internal" Apr 16 10:52:23.731783 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:23.731751 2525 reconciler_common.go:299] "Volume detached for volume \"pvc-bc8d9aae-8518-4c6b-8295-aee7ecba8306\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-089edf1f36fe3ef60\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"csi-cc63bc0088325ab8734a037471910399bf37dc254953f225966535691a1c0aa6\"" Apr 16 10:52:24.300122 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:24.300027 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6" event={"ID":"70f1e922-1884-4d7e-98c8-3d4ea733133f","Type":"ContainerDied","Data":"918dce1cdef43f6e74bd89f365a4b3f4be9243c601d2251a1b82e0776d904f2f"} Apr 16 10:52:24.300122 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:24.300071 2525 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="918dce1cdef43f6e74bd89f365a4b3f4be9243c601d2251a1b82e0776d904f2f" Apr 16 10:52:24.300122 
ip-10-0-135-1 kubenswrapper[2525]: I0416 10:52:24.300052 2525 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6" Apr 16 10:53:47.752316 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:47.752283 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-28lhx_test-trainjob-4xc4f-dataset-initializer-0-0-4mckq_d2b0601f-371c-4e9f-96d5-22c1703882ac/dataset-initializer/0.log" Apr 16 10:53:47.758727 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:47.758705 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-28lhx_test-trainjob-4xc4f-model-initializer-0-0-n9tl6_70f1e922-1884-4d7e-98c8-3d4ea733133f/model-initializer/0.log" Apr 16 10:53:50.988399 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:50.988361 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6"] Apr 16 10:53:50.989610 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:50.989584 2525 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70f1e922-1884-4d7e-98c8-3d4ea733133f" containerName="model-initializer" Apr 16 10:53:50.989772 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:50.989761 2525 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f1e922-1884-4d7e-98c8-3d4ea733133f" containerName="model-initializer" Apr 16 10:53:50.990040 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:50.990026 2525 memory_manager.go:356] "RemoveStaleState removing state" podUID="70f1e922-1884-4d7e-98c8-3d4ea733133f" containerName="model-initializer" Apr 16 10:53:50.993922 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:50.993899 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" Apr 16 10:53:50.996569 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:50.996544 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6"] Apr 16 10:53:50.996712 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:50.996589 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-b74lg\"/\"default-dockercfg-mdsth\"" Apr 16 10:53:50.996875 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:50.996732 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-b74lg\"/\"openshift-service-ca.crt\"" Apr 16 10:53:50.996954 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:50.996862 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-b74lg\"/\"kube-root-ca.crt\"" Apr 16 10:53:51.046805 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:51.046763 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n48pp\" (UniqueName: \"kubernetes.io/projected/5ac6fd66-2c01-4df2-be2f-a20130827228-kube-api-access-n48pp\") pod \"test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6\" (UID: \"5ac6fd66-2c01-4df2-be2f-a20130827228\") " pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" Apr 16 10:53:51.147886 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:51.147845 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n48pp\" (UniqueName: \"kubernetes.io/projected/5ac6fd66-2c01-4df2-be2f-a20130827228-kube-api-access-n48pp\") pod \"test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6\" (UID: \"5ac6fd66-2c01-4df2-be2f-a20130827228\") " pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" Apr 16 10:53:51.156369 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:51.156344 2525 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n48pp\" (UniqueName: \"kubernetes.io/projected/5ac6fd66-2c01-4df2-be2f-a20130827228-kube-api-access-n48pp\") pod \"test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6\" (UID: \"5ac6fd66-2c01-4df2-be2f-a20130827228\") " pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" Apr 16 10:53:52.807358 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:52.807320 2525 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"] Apr 16 10:53:52.811504 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:52.811476 2525 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-28lhx/test-trainjob-4xc4f-dataset-initializer-0-0-4mckq"] Apr 16 10:53:52.819257 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:52.819236 2525 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"] Apr 16 10:53:52.823212 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:52.823192 2525 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-28lhx/test-trainjob-4xc4f-model-initializer-0-0-n9tl6"] Apr 16 10:53:53.992771 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:53.992741 2525 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f1e922-1884-4d7e-98c8-3d4ea733133f" path="/var/lib/kubelet/pods/70f1e922-1884-4d7e-98c8-3d4ea733133f/volumes" Apr 16 10:53:53.993222 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:53.993086 2525 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2b0601f-371c-4e9f-96d5-22c1703882ac" path="/var/lib/kubelet/pods/d2b0601f-371c-4e9f-96d5-22c1703882ac/volumes" Apr 16 10:53:57.199684 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:57.199603 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-05b62023-4032-4fa3-8168-76e84088cdfa\" (UniqueName: 
\"kubernetes.io/csi/ebs.csi.aws.com^vol-023b6d3bf07de7e8b\") pod \"test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6\" (UID: \"5ac6fd66-2c01-4df2-be2f-a20130827228\") " pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" Apr 16 10:53:57.209494 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:57.204910 2525 operation_generator.go:1469] "Controller attach succeeded for volume \"pvc-05b62023-4032-4fa3-8168-76e84088cdfa\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-023b6d3bf07de7e8b\") pod \"test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6\" (UID: \"5ac6fd66-2c01-4df2-be2f-a20130827228\") device path: \"\"" pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" Apr 16 10:53:57.300293 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:57.300254 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-05b62023-4032-4fa3-8168-76e84088cdfa\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-023b6d3bf07de7e8b\") pod \"test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6\" (UID: \"5ac6fd66-2c01-4df2-be2f-a20130827228\") " pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" Apr 16 10:53:57.300438 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:57.300386 2525 operation_generator.go:515] "MountVolume.WaitForAttach entering for volume \"pvc-05b62023-4032-4fa3-8168-76e84088cdfa\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-023b6d3bf07de7e8b\") pod \"test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6\" (UID: \"5ac6fd66-2c01-4df2-be2f-a20130827228\") DevicePath \"\"" pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" Apr 16 10:53:57.303812 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:57.303794 2525 operation_generator.go:525] "MountVolume.WaitForAttach succeeded for volume \"pvc-05b62023-4032-4fa3-8168-76e84088cdfa\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-023b6d3bf07de7e8b\") pod 
\"test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6\" (UID: \"5ac6fd66-2c01-4df2-be2f-a20130827228\") DevicePath \"csi-50d3716e115d506af115787ba2f2f426e0b3d39067ffccb06a3464d0b2ac248a\"" pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" Apr 16 10:53:57.467163 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:57.467091 2525 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-05b62023-4032-4fa3-8168-76e84088cdfa\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-023b6d3bf07de7e8b\") pod \"test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6\" (UID: \"5ac6fd66-2c01-4df2-be2f-a20130827228\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/ebs.csi.aws.com/fbc1b57d10e6520a1e2702050f8d887135a052fe1fc18200aa09588df0249692/globalmount\"" pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" Apr 16 10:53:57.480303 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:57.480279 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-05b62023-4032-4fa3-8168-76e84088cdfa\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-023b6d3bf07de7e8b\") pod \"test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6\" (UID: \"5ac6fd66-2c01-4df2-be2f-a20130827228\") " pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" Apr 16 10:53:57.605378 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:57.605347 2525 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" Apr 16 10:53:57.730159 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:57.730131 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6"] Apr 16 10:53:57.732551 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:53:57.732519 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ac6fd66_2c01_4df2_be2f_a20130827228.slice/crio-1677752673a5cb7167f491d08878b0dd66c27d525e30ee6106a2cabd22ccab10 WatchSource:0}: Error finding container 1677752673a5cb7167f491d08878b0dd66c27d525e30ee6106a2cabd22ccab10: Status 404 returned error can't find the container with id 1677752673a5cb7167f491d08878b0dd66c27d525e30ee6106a2cabd22ccab10 Apr 16 10:53:58.574945 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:58.574911 2525 generic.go:358] "Generic (PLEG): container finished" podID="5ac6fd66-2c01-4df2-be2f-a20130827228" containerID="b53a2a2a8be65d1a9424e18e4dfbe7b1d610ad92edd833c2fc267cd1c5bcd84f" exitCode=1 Apr 16 10:53:58.575328 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:58.574981 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" event={"ID":"5ac6fd66-2c01-4df2-be2f-a20130827228","Type":"ContainerDied","Data":"b53a2a2a8be65d1a9424e18e4dfbe7b1d610ad92edd833c2fc267cd1c5bcd84f"} Apr 16 10:53:58.575328 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:58.575003 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" event={"ID":"5ac6fd66-2c01-4df2-be2f-a20130827228","Type":"ContainerStarted","Data":"1677752673a5cb7167f491d08878b0dd66c27d525e30ee6106a2cabd22ccab10"} Apr 16 10:53:59.697426 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:59.697397 2525 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" Apr 16 10:53:59.819939 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:59.819906 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workspace\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-023b6d3bf07de7e8b\") pod \"5ac6fd66-2c01-4df2-be2f-a20130827228\" (UID: \"5ac6fd66-2c01-4df2-be2f-a20130827228\") " Apr 16 10:53:59.820131 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:59.820018 2525 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n48pp\" (UniqueName: \"kubernetes.io/projected/5ac6fd66-2c01-4df2-be2f-a20130827228-kube-api-access-n48pp\") pod \"5ac6fd66-2c01-4df2-be2f-a20130827228\" (UID: \"5ac6fd66-2c01-4df2-be2f-a20130827228\") " Apr 16 10:53:59.822418 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:59.822386 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac6fd66-2c01-4df2-be2f-a20130827228-kube-api-access-n48pp" (OuterVolumeSpecName: "kube-api-access-n48pp") pod "5ac6fd66-2c01-4df2-be2f-a20130827228" (UID: "5ac6fd66-2c01-4df2-be2f-a20130827228"). InnerVolumeSpecName "kube-api-access-n48pp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:53:59.822591 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:59.822568 2525 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/ebs.csi.aws.com^vol-023b6d3bf07de7e8b" (OuterVolumeSpecName: "workspace") pod "5ac6fd66-2c01-4df2-be2f-a20130827228" (UID: "5ac6fd66-2c01-4df2-be2f-a20130827228"). InnerVolumeSpecName "pvc-05b62023-4032-4fa3-8168-76e84088cdfa". 
PluginName "kubernetes.io/csi", VolumeGIDValue "" Apr 16 10:53:59.921199 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:59.921113 2525 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"pvc-05b62023-4032-4fa3-8168-76e84088cdfa\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-023b6d3bf07de7e8b\") on node \"ip-10-0-135-1.ec2.internal\" " Apr 16 10:53:59.921199 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:59.921143 2525 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n48pp\" (UniqueName: \"kubernetes.io/projected/5ac6fd66-2c01-4df2-be2f-a20130827228-kube-api-access-n48pp\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"\"" Apr 16 10:53:59.934310 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:53:59.934292 2525 operation_generator.go:895] UnmountDevice succeeded for volume "pvc-05b62023-4032-4fa3-8168-76e84088cdfa" (UniqueName: "kubernetes.io/csi/ebs.csi.aws.com^vol-023b6d3bf07de7e8b") on node "ip-10-0-135-1.ec2.internal" Apr 16 10:54:00.022437 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:54:00.022408 2525 reconciler_common.go:299] "Volume detached for volume \"pvc-05b62023-4032-4fa3-8168-76e84088cdfa\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-023b6d3bf07de7e8b\") on node \"ip-10-0-135-1.ec2.internal\" DevicePath \"csi-50d3716e115d506af115787ba2f2f426e0b3d39067ffccb06a3464d0b2ac248a\"" Apr 16 10:54:00.583802 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:54:00.583771 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" event={"ID":"5ac6fd66-2c01-4df2-be2f-a20130827228","Type":"ContainerDied","Data":"1677752673a5cb7167f491d08878b0dd66c27d525e30ee6106a2cabd22ccab10"} Apr 16 10:54:00.583802 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:54:00.583804 2525 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1677752673a5cb7167f491d08878b0dd66c27d525e30ee6106a2cabd22ccab10" Apr 16 10:54:00.584006 
ip-10-0-135-1 kubenswrapper[2525]: I0416 10:54:00.583808 2525 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6" Apr 16 10:54:00.948580 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:54:00.948451 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-b74lg_test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6_5ac6fd66-2c01-4df2-be2f-a20130827228/dataset-initializer/0.log" Apr 16 10:54:05.978725 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:54:05.978684 2525 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6"] Apr 16 10:54:05.982014 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:54:05.981983 2525 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-b74lg/test-trainjob-fail-5sz7g-dataset-initializer-0-0-kdzp6"] Apr 16 10:54:05.992378 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:54:05.992350 2525 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac6fd66-2c01-4df2-be2f-a20130827228" path="/var/lib/kubelet/pods/5ac6fd66-2c01-4df2-be2f-a20130827228/volumes" Apr 16 10:54:38.057949 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:54:38.057921 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log" Apr 16 10:54:38.062949 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:54:38.062926 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log" Apr 16 10:55:01.916776 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:01.916747 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-lkz6h_38ea256d-b16d-4ed9-9a51-5a9007157783/global-pull-secret-syncer/0.log" Apr 16 10:55:02.049797 
ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:02.049770 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-pwsbd_34e490e3-17d2-4fa4-ba1d-76247d80e91c/konnectivity-agent/0.log" Apr 16 10:55:02.133000 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:02.132978 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-1.ec2.internal_b1bfdb1b2e8b28ab395d30767607393b/haproxy/0.log" Apr 16 10:55:05.543116 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:05.543038 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-j6qrg_c26fcd6b-d7f0-4fc8-809f-192c8b9eb35c/cluster-monitoring-operator/0.log" Apr 16 10:55:05.737859 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:05.737831 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-688967fbc4-7cgfh_c2f4245b-c7fb-4515-bae2-db6271349435/metrics-server/0.log" Apr 16 10:55:05.841844 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:05.841808 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-67ww5_68dcc22a-a6f3-46f9-a654-7e6b830011c3/node-exporter/0.log" Apr 16 10:55:05.885676 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:05.885645 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-67ww5_68dcc22a-a6f3-46f9-a654-7e6b830011c3/kube-rbac-proxy/0.log" Apr 16 10:55:05.919679 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:05.919653 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-67ww5_68dcc22a-a6f3-46f9-a654-7e6b830011c3/init-textfile/0.log" Apr 16 10:55:06.413443 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:06.413414 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_75e4c1e8-fa1c-41fb-80d5-35856f420384/prometheus/0.log" Apr 16 
10:55:06.468861 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:06.468826 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_75e4c1e8-fa1c-41fb-80d5-35856f420384/config-reloader/0.log" Apr 16 10:55:06.501178 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:06.501139 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_75e4c1e8-fa1c-41fb-80d5-35856f420384/thanos-sidecar/0.log" Apr 16 10:55:06.536886 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:06.536805 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_75e4c1e8-fa1c-41fb-80d5-35856f420384/kube-rbac-proxy-web/0.log" Apr 16 10:55:06.598057 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:06.598030 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_75e4c1e8-fa1c-41fb-80d5-35856f420384/kube-rbac-proxy/0.log" Apr 16 10:55:06.658088 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:06.658063 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_75e4c1e8-fa1c-41fb-80d5-35856f420384/kube-rbac-proxy-thanos/0.log" Apr 16 10:55:06.715766 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:06.715738 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_75e4c1e8-fa1c-41fb-80d5-35856f420384/init-config-reloader/0.log" Apr 16 10:55:06.780947 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:06.780919 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-6xll6_e8c5484b-2c69-48a7-aff1-28b3f4709e90/prometheus-operator/0.log" Apr 16 10:55:06.829923 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:06.829897 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-6xll6_e8c5484b-2c69-48a7-aff1-28b3f4709e90/kube-rbac-proxy/0.log" Apr 16 
10:55:07.033482 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:07.033453 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8ddddd4-lc5g8_2deb8c49-4b7e-40e9-b0a9-a6db1e950268/thanos-query/0.log" Apr 16 10:55:07.062479 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:07.062447 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8ddddd4-lc5g8_2deb8c49-4b7e-40e9-b0a9-a6db1e950268/kube-rbac-proxy-web/0.log" Apr 16 10:55:07.089012 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:07.088944 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8ddddd4-lc5g8_2deb8c49-4b7e-40e9-b0a9-a6db1e950268/kube-rbac-proxy/0.log" Apr 16 10:55:07.113012 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:07.112990 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8ddddd4-lc5g8_2deb8c49-4b7e-40e9-b0a9-a6db1e950268/prom-label-proxy/0.log" Apr 16 10:55:07.141393 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:07.141372 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8ddddd4-lc5g8_2deb8c49-4b7e-40e9-b0a9-a6db1e950268/kube-rbac-proxy-rules/0.log" Apr 16 10:55:07.165656 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:07.165630 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8ddddd4-lc5g8_2deb8c49-4b7e-40e9-b0a9-a6db1e950268/kube-rbac-proxy-metrics/0.log" Apr 16 10:55:08.191411 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.191368 2525 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"] Apr 16 10:55:08.191872 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.191759 2525 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ac6fd66-2c01-4df2-be2f-a20130827228" containerName="dataset-initializer" Apr 16 10:55:08.191872 
ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.191774 2525 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac6fd66-2c01-4df2-be2f-a20130827228" containerName="dataset-initializer" Apr 16 10:55:08.191872 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.191821 2525 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ac6fd66-2c01-4df2-be2f-a20130827228" containerName="dataset-initializer" Apr 16 10:55:08.194864 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.194843 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl" Apr 16 10:55:08.197222 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.197202 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d9gfw\"/\"kube-root-ca.crt\"" Apr 16 10:55:08.198125 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.198100 2525 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-d9gfw\"/\"default-dockercfg-glr59\"" Apr 16 10:55:08.198242 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.198138 2525 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d9gfw\"/\"openshift-service-ca.crt\"" Apr 16 10:55:08.205312 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.205290 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"] Apr 16 10:55:08.289821 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.289785 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc-proc\") pod \"perf-node-gather-daemonset-grpkl\" (UID: \"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl" Apr 16 10:55:08.290019 ip-10-0-135-1 kubenswrapper[2525]: I0416 
10:55:08.289851 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fsgq\" (UniqueName: \"kubernetes.io/projected/911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc-kube-api-access-8fsgq\") pod \"perf-node-gather-daemonset-grpkl\" (UID: \"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.290019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.289875 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc-sys\") pod \"perf-node-gather-daemonset-grpkl\" (UID: \"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.290019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.289890 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc-lib-modules\") pod \"perf-node-gather-daemonset-grpkl\" (UID: \"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.290019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.289916 2525 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc-podres\") pod \"perf-node-gather-daemonset-grpkl\" (UID: \"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.391315 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.391279 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fsgq\" (UniqueName: \"kubernetes.io/projected/911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc-kube-api-access-8fsgq\") pod \"perf-node-gather-daemonset-grpkl\" (UID: \"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.391506 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.391325 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc-sys\") pod \"perf-node-gather-daemonset-grpkl\" (UID: \"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.391506 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.391349 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc-lib-modules\") pod \"perf-node-gather-daemonset-grpkl\" (UID: \"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.391506 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.391384 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc-podres\") pod \"perf-node-gather-daemonset-grpkl\" (UID: \"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.391506 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.391425 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc-sys\") pod \"perf-node-gather-daemonset-grpkl\" (UID: \"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.391506 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.391433 2525 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc-proc\") pod \"perf-node-gather-daemonset-grpkl\" (UID: \"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.391732 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.391543 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc-proc\") pod \"perf-node-gather-daemonset-grpkl\" (UID: \"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.391732 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.391559 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc-podres\") pod \"perf-node-gather-daemonset-grpkl\" (UID: \"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.391732 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.391563 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc-lib-modules\") pod \"perf-node-gather-daemonset-grpkl\" (UID: \"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.400622 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.400595 2525 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fsgq\" (UniqueName: \"kubernetes.io/projected/911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc-kube-api-access-8fsgq\") pod \"perf-node-gather-daemonset-grpkl\" (UID: \"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.471989 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.471897 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/2.log"
Apr 16 10:55:08.476455 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.476433 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-c2rf2_8603601a-c2cf-4413-9cb5-1801baafd774/console-operator/3.log"
Apr 16 10:55:08.505185 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.505152 2525 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.628541 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.628402 2525 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"]
Apr 16 10:55:08.631256 ip-10-0-135-1 kubenswrapper[2525]: W0416 10:55:08.631226 2525 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod911f4f2f_27fa_4ccf_b9e5_7ea95b97dfbc.slice/crio-d70787ef75a9bcbf41bfedc321fd3f6462cca92cd2dee706162ec7d454dd240c WatchSource:0}: Error finding container d70787ef75a9bcbf41bfedc321fd3f6462cca92cd2dee706162ec7d454dd240c: Status 404 returned error can't find the container with id d70787ef75a9bcbf41bfedc321fd3f6462cca92cd2dee706162ec7d454dd240c
Apr 16 10:55:08.805870 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.805829 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl" event={"ID":"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc","Type":"ContainerStarted","Data":"78e9d41b3e14f406b1a1d153db290fb7fc748489a8134d04e5274dfe2dd30b01"}
Apr 16 10:55:08.805870 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.805869 2525 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl" event={"ID":"911f4f2f-27fa-4ccf-b9e5-7ea95b97dfbc","Type":"ContainerStarted","Data":"d70787ef75a9bcbf41bfedc321fd3f6462cca92cd2dee706162ec7d454dd240c"}
Apr 16 10:55:08.806228 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.805902 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:08.823869 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.823810 2525 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl" podStartSLOduration=0.823790702 podStartE2EDuration="823.790702ms" podCreationTimestamp="2026-04-16 10:55:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:55:08.823142344 +0000 UTC m=+3031.422590323" watchObservedRunningTime="2026-04-16 10:55:08.823790702 +0000 UTC m=+3031.423238658"
Apr 16 10:55:08.903321 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:08.903283 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-gbvlr_feca609b-cecb-458c-99d4-6f8e74e5ca33/download-server/0.log"
Apr 16 10:55:09.991251 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:09.991223 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sh9pl_86d6efca-ec4a-40d2-a200-6d8dacba5368/dns/0.log"
Apr 16 10:55:10.017452 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:10.017425 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sh9pl_86d6efca-ec4a-40d2-a200-6d8dacba5368/kube-rbac-proxy/0.log"
Apr 16 10:55:10.109016 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:10.108983 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-scbdk_9c0ede08-fd0f-4aba-9ec1-e44fb8d11bd4/dns-node-resolver/0.log"
Apr 16 10:55:10.557215 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:10.557185 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hhqrb_3912faf1-da9e-41de-8d4c-b2cdb354f252/node-ca/0.log"
Apr 16 10:55:11.726561 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:11.726531 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xnnjq_4a03f4b8-27ee-4dd2-8150-2b73a73e4f06/serve-healthcheck-canary/0.log"
Apr 16 10:55:12.132290 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:12.132264 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2b85g_1888e23b-9237-4f13-9a55-ad7aa5434c5d/kube-rbac-proxy/0.log"
Apr 16 10:55:12.155331 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:12.155304 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2b85g_1888e23b-9237-4f13-9a55-ad7aa5434c5d/exporter/0.log"
Apr 16 10:55:12.177694 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:12.177666 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2b85g_1888e23b-9237-4f13-9a55-ad7aa5434c5d/extractor/0.log"
Apr 16 10:55:14.030597 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:14.030556 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-5778cd48b5-2x7mz_15c47f47-5943-4380-83d5-1408e140b25b/manager/0.log"
Apr 16 10:55:14.818979 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:14.818954 2525 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-grpkl"
Apr 16 10:55:17.213579 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:17.213551 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-6t6gn_63ea1f8d-2d65-47e6-84cc-63d854cc64de/migrator/0.log"
Apr 16 10:55:17.237676 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:17.237649 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-6t6gn_63ea1f8d-2d65-47e6-84cc-63d854cc64de/graceful-termination/0.log"
Apr 16 10:55:17.595680 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:17.595650 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-vnkxh_e96bf9a9-c840-4751-89c6-13968641abc6/kube-storage-version-migrator-operator/1.log"
Apr 16 10:55:17.596411 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:17.596392 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-vnkxh_e96bf9a9-c840-4751-89c6-13968641abc6/kube-storage-version-migrator-operator/0.log"
Apr 16 10:55:18.482443 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:18.482363 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fpw2w_dc3554db-b5f4-43be-8180-e48573e27cb6/kube-multus-additional-cni-plugins/0.log"
Apr 16 10:55:18.505123 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:18.505097 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fpw2w_dc3554db-b5f4-43be-8180-e48573e27cb6/egress-router-binary-copy/0.log"
Apr 16 10:55:18.527336 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:18.527313 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fpw2w_dc3554db-b5f4-43be-8180-e48573e27cb6/cni-plugins/0.log"
Apr 16 10:55:18.547013 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:18.546986 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fpw2w_dc3554db-b5f4-43be-8180-e48573e27cb6/bond-cni-plugin/0.log"
Apr 16 10:55:18.575253 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:18.575227 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fpw2w_dc3554db-b5f4-43be-8180-e48573e27cb6/routeoverride-cni/0.log"
Apr 16 10:55:18.598566 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:18.598538 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fpw2w_dc3554db-b5f4-43be-8180-e48573e27cb6/whereabouts-cni-bincopy/0.log"
Apr 16 10:55:18.625362 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:18.625338 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fpw2w_dc3554db-b5f4-43be-8180-e48573e27cb6/whereabouts-cni/0.log"
Apr 16 10:55:19.118390 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:19.118355 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pmkq5_27405ca4-ee70-4403-a147-625dbadaa808/kube-multus/0.log"
Apr 16 10:55:19.300644 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:19.300613 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mczdx_4739af65-2ca2-4e8f-9b0e-ddeac76a9b66/network-metrics-daemon/0.log"
Apr 16 10:55:19.336959 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:19.336891 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mczdx_4739af65-2ca2-4e8f-9b0e-ddeac76a9b66/kube-rbac-proxy/0.log"
Apr 16 10:55:20.402686 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:20.402626 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rk78v_2bdec43e-45b2-4aa1-8605-fa162a8fc58b/ovn-controller/0.log"
Apr 16 10:55:20.444495 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:20.444464 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rk78v_2bdec43e-45b2-4aa1-8605-fa162a8fc58b/ovn-acl-logging/0.log"
Apr 16 10:55:20.465570 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:20.465540 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rk78v_2bdec43e-45b2-4aa1-8605-fa162a8fc58b/kube-rbac-proxy-node/0.log"
Apr 16 10:55:20.492424 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:20.492395 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rk78v_2bdec43e-45b2-4aa1-8605-fa162a8fc58b/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 10:55:20.512968 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:20.512939 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rk78v_2bdec43e-45b2-4aa1-8605-fa162a8fc58b/northd/0.log"
Apr 16 10:55:20.540019 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:20.539990 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rk78v_2bdec43e-45b2-4aa1-8605-fa162a8fc58b/nbdb/0.log"
Apr 16 10:55:20.565043 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:20.565016 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rk78v_2bdec43e-45b2-4aa1-8605-fa162a8fc58b/sbdb/0.log"
Apr 16 10:55:20.668802 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:20.668723 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rk78v_2bdec43e-45b2-4aa1-8605-fa162a8fc58b/ovnkube-controller/0.log"
Apr 16 10:55:22.109288 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:22.109252 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-nz7jd_19d87478-ac42-469f-a6d5-8ca391f485f6/network-check-target-container/0.log"
Apr 16 10:55:22.992382 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:22.992353 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-htz8x_eb2d0ec7-6730-4604-8528-e4a98ff4857d/iptables-alerter/0.log"
Apr 16 10:55:23.588038 ip-10-0-135-1 kubenswrapper[2525]: I0416 10:55:23.588013 2525 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-mfrrt_0d5f6a61-ffda-4acb-83ec-7d961f2ef458/tuned/0.log"