Apr 17 11:15:37.003045 ip-10-0-133-190 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 11:15:37.506425 ip-10-0-133-190 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:15:37.506425 ip-10-0-133-190 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 11:15:37.506425 ip-10-0-133-190 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:15:37.506425 ip-10-0-133-190 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 11:15:37.506425 ip-10-0-133-190 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:15:37.509841 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.509740    2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 11:15:37.512944 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512927    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:15:37.512944 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512944    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512948    2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512951    2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512955    2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512957    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512960    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512963    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512966    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512969    2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512972    2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512975    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512985    2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512988    2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512991    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512993    2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512996    2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.512998    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513001    2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513004    2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513007    2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:15:37.513010 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513009    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513012    2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513015    2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513018    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513021    2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513023    2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513026    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513029    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513032    2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513035    2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513038    2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513040    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513043    2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513045    2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513048    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513050    2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513053    2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513056    2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513058    2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513061    2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:15:37.513484 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513063    2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513065    2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513068    2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513070    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513073    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513077    2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513080    2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513082    2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513085    2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513088    2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513090    2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513093    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513095    2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513098    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513101    2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513104    2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513107    2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513109    2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513112    2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:15:37.513986 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513116    2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513120    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513123    2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513125    2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513129    2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513131    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513134    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513136    2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513139    2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513142    2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513144    2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513147    2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513149    2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513151    2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513154    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513157    2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513159    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513162    2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513165    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:15:37.514440 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513169    2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513172    2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513175    2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513178    2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513181    2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513184    2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513187    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513576    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513582    2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513585    2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513587    2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513590    2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513593    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513595    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513598    2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513601    2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513603    2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513606    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513609    2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:15:37.514908 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513611    2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513614    2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513616    2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513619    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513621    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513624    2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513626    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513629    2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513633    2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513637    2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513640    2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513642    2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513647    2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513650    2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513653    2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513656    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513659    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513661    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513663    2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:15:37.515354 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513666    2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513669    2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513673    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513676    2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513679    2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513682    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513684    2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513687    2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513689    2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513692    2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513695    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513697    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513700    2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513702    2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513704    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513707    2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513710    2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513712    2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513715    2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513717    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:15:37.515844 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513720    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513722    2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513725    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513727    2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513731    2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513734    2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513738    2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513740    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513743    2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513746    2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513748    2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513751    2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513754    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513757    2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513759    2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513762    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513778    2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513782    2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513784    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513787    2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:15:37.516326 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513790    2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513792    2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513796    2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513799    2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513802    2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513805    2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513807    2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513810    2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513812    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513815    2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513817    2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513820    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513822    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513825    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.513827    2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514831    2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514846    2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514853    2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514857    2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514863    2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514867    2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 11:15:37.516842 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514871    2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514876    2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514879    2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514882    2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514886    2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514890    2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514893    2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514896    2579 flags.go:64] FLAG: --cgroup-root=""
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514899    2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514902    2579 flags.go:64] FLAG: --client-ca-file=""
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514906    2579 flags.go:64] FLAG: --cloud-config=""
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514909    2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514912    2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514917    2579 flags.go:64] FLAG: --cluster-domain=""
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514921    2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514924    2579 flags.go:64] FLAG: --config-dir=""
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514927    2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514931    2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514935    2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514938    2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514942    2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514945    2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514948    2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514951    2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 11:15:37.517628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514954    2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514958    2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514961    2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514965    2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514968    2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514971    2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514974    2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514978    2579 flags.go:64] FLAG: --enable-server="true"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514981    2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514986    2579 flags.go:64] FLAG: --event-burst="100"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514989    2579 flags.go:64] FLAG: --event-qps="50"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514992    2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514995    2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.514998    2579 flags.go:64] FLAG: --eviction-hard=""
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515002    2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515005    2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515008    2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515011    2579 flags.go:64] FLAG: --eviction-soft=""
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515014    2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515017    2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515020    2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515028    2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515031    2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515034    2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515037    2579 flags.go:64] FLAG: --feature-gates=""
Apr 17 11:15:37.518297 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515044    2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515050    2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515053    2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515057    2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515060    2579 flags.go:64] FLAG:
--healthz-port="10248" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515064 2579 flags.go:64] FLAG: --help="false" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515066 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-133-190.ec2.internal" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515070 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515073 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515076 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515079 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515082 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515085 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515088 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515091 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515094 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515098 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515101 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 11:15:37.518917 ip-10-0-133-190 
kubenswrapper[2579]: I0417 11:15:37.515104 2579 flags.go:64] FLAG: --kube-reserved="" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515107 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515110 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515113 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515116 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515118 2579 flags.go:64] FLAG: --lock-file="" Apr 17 11:15:37.518917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515121 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515124 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515127 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515133 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515137 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515140 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515143 2579 flags.go:64] FLAG: --logging-format="text" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515146 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515150 2579 flags.go:64] FLAG: 
--make-iptables-util-chains="true" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515154 2579 flags.go:64] FLAG: --manifest-url="" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515157 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515161 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515165 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515169 2579 flags.go:64] FLAG: --max-pods="110" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515172 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515175 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515178 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515181 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515184 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515187 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515190 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515198 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515201 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 
11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515204 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 11:15:37.519503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515208 2579 flags.go:64] FLAG: --pod-cidr="" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515211 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515217 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515220 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515223 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515226 2579 flags.go:64] FLAG: --port="10250" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515229 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515233 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-062d1804f617ca6c9" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515236 2579 flags.go:64] FLAG: --qos-reserved="" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515239 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515242 2579 flags.go:64] FLAG: --register-node="true" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515245 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515249 2579 flags.go:64] FLAG: --register-with-taints="" Apr 17 11:15:37.520128 ip-10-0-133-190 
kubenswrapper[2579]: I0417 11:15:37.515253 2579 flags.go:64] FLAG: --registry-burst="10" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515256 2579 flags.go:64] FLAG: --registry-qps="5" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515258 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515261 2579 flags.go:64] FLAG: --reserved-memory="" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515265 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515269 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515273 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515275 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515278 2579 flags.go:64] FLAG: --runonce="false" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515281 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515284 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515287 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 17 11:15:37.520128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515290 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515293 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515296 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 
11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515299 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515302 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515305 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515308 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515311 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515314 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515318 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515321 2579 flags.go:64] FLAG: --system-cgroups="" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515324 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515329 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515332 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515335 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515340 2579 flags.go:64] FLAG: --tls-min-version="" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515343 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515346 2579 
flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515348 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515352 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515355 2579 flags.go:64] FLAG: --v="2" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515360 2579 flags.go:64] FLAG: --version="false" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515365 2579 flags.go:64] FLAG: --vmodule="" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515369 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.515373 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 11:15:37.520731 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515494 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515499 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515502 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515505 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515508 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515510 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:15:37.521386 ip-10-0-133-190 
kubenswrapper[2579]: W0417 11:15:37.515513 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515516 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515518 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515521 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515524 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515526 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515529 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515532 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515535 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515537 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515541 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515544 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515547 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:15:37.521386 
ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515549 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:15:37.521386 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515552 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515554 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515557 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515559 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515562 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515564 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515568 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515571 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515573 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515576 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515579 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515581 2579 feature_gate.go:328] 
unrecognized feature gate: AWSDedicatedHosts Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515585 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515588 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515590 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515594 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515597 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515600 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515603 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:15:37.521917 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515605 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515608 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515610 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515613 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515615 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 
11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515618 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515621 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515623 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515626 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515628 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515632 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515634 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515636 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515639 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515642 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515644 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515647 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515650 2579 feature_gate.go:328] 
unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515652 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515656 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:15:37.522653 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515659 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515661 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515664 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515666 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515670 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515675 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515678 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515681 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515684 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515687 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515690 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515692 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515695 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515697 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515700 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515702 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515705 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 
11:15:37.515707 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515710 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:15:37.523543 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515712 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515715 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515717 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515720 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515723 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515726 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515728 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.515731 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.516638 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.524061 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.524083 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524157 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524164 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524170 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524175 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524180 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:15:37.524360 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524185 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524190 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524194 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524199 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524204 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524209 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524214 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524218 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524223 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524227 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524231 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524236 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524240 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524244 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524248 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524253 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524257 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524264 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524272 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524277 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:15:37.525092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524282 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524287 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524291 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524296 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524300 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524304 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524309 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524314 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524318 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524322 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524326 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524331 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524335 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524339 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524343 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524347 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524351 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524355 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524359 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524363 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:15:37.525679 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524367 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524371 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524375 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524379 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524383 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524388 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524394 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524400 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524405 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524410 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524414 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524418 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524422 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524426 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524430 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524435 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524439 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524443 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524449 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:15:37.526285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524453 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524457 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524462 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524466 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524470 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524475 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524479 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524483 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524488 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524492 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524496 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524500 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524505 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524510 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524514 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524518 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524522 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524526 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524530 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524535 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:15:37.526750 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524539 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524543 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.524551 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524712 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524720 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524725 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524730 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524735 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524739 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524744 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524748 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524753 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524758 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524762 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524787 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524792 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:15:37.527435 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524796 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524801 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524807 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524814 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524818 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524822 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524827 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524832 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524836 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524841 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524846 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524850 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524854 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524858 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524863 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524867 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524871 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524875 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524879 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:15:37.528130 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524884 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524891 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524896 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524901 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524906 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524911 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524916 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524920 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524925 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524930 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524936 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524940 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524944 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524948 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524952 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524957 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524961 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524965 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524969 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524973 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:15:37.528617 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524977 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524982 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524986 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524991 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524995 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.524999 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525003 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525007 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525011 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525015 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525020 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525024 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525029 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525033 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525037 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525041 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525045 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525049 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525054 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525058 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:15:37.529534 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525064 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:15:37.530131 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525068 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:15:37.530131 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525073 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:15:37.530131 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525077 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:15:37.530131 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525081 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:15:37.530131 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525085 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:15:37.530131 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525089 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:15:37.530131 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525093 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:15:37.530131 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525097 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:15:37.530131 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525101 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:15:37.530131 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525105 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:15:37.530131 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525109 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:15:37.530131 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525113 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:15:37.530131 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:37.525118 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:15:37.530131 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.525125 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:15:37.530131 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.526921 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 11:15:37.530657 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.530237 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 11:15:37.531341 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.531327 2579 server.go:1019] "Starting client certificate rotation"
Apr 17 11:15:37.531450 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.531431 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:15:37.531819 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.531807 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:15:37.561366 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.561334 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:15:37.565561 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.565539 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:15:37.582677 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.582650 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 17 11:15:37.590040 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.590017 2579 log.go:25] "Validated CRI v1 image API"
Apr 17 11:15:37.591198 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.591175 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 11:15:37.593857 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.593832 2579 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9ca9c515-e01e-41a5-8f53-a4528e386e9e:/dev/nvme0n1p3 d57fb5c8-6d4d-4092-bbe9-2b1f00f43f67:/dev/nvme0n1p4]
Apr 17 11:15:37.593946 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.593856 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 11:15:37.594999 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.594983 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:15:37.599981 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.599876 2579 manager.go:217] Machine: {Timestamp:2026-04-17 11:15:37.597692487 +0000 UTC m=+0.464216628 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3103318 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26c421159499e8dfcc52747c5bbe53 SystemUUID:ec26c421-1594-99e8-dfcc-52747c5bbe53 BootID:dbcc1502-569e-481a-9a9e-92438a839fb0 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5b:55:42:49:0d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5b:55:42:49:0d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2a:10:7e:d5:17:9b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 11:15:37.599981 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.599976 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 11:15:37.600094 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.600061 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 11:15:37.602285 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.602259 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 11:15:37.602436 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.602288 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-190.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 11:15:37.602484 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.602446 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 11:15:37.602484 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.602455 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 11:15:37.602484 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.602468 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 11:15:37.602563 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.602488 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 11:15:37.603718 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.603707 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 11:15:37.603846 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.603837 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 11:15:37.606963 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.606953 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 11:15:37.607000 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.606971 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 11:15:37.607000 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.606983 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 11:15:37.607000 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.606992 2579 kubelet.go:397] "Adding apiserver pod source"
Apr 17 11:15:37.607110 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.607009 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 11:15:37.608350 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.608339 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 11:15:37.608395 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.608358 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 11:15:37.611979 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.611964 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 11:15:37.613698 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.613685 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 11:15:37.614958 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.614947 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 11:15:37.614994 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.614964 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 11:15:37.614994 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.614973 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 11:15:37.614994 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.614981 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 11:15:37.614994 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.614989 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 11:15:37.614994 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.614995 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 11:15:37.615132 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.615001 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 11:15:37.615132 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.615009 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 11:15:37.615132 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.615018 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 11:15:37.615132 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.615024 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 11:15:37.615132 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.615042 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 11:15:37.615132 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.615051 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 11:15:37.616146 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.616134 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 11:15:37.616186 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.616147 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 11:15:37.619676 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.619662 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 11:15:37.619754 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.619697 2579 server.go:1295] "Started kubelet"
Apr 17 11:15:37.619839 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.619817 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 11:15:37.619967 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.619799 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 11:15:37.620026 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.619988 2579 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 11:15:37.620405 ip-10-0-133-190 systemd[1]: Started Kubernetes Kubelet.
Apr 17 11:15:37.621477 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.621460 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 11:15:37.622724 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.622710 2579 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 11:15:37.623668 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.623587 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-190.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 11:15:37.623668 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.623624 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-190.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 11:15:37.624028 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.623994 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 11:15:37.627146 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.627129 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 11:15:37.627848 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.627831 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 11:15:37.628680 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.628656 2579 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 11:15:37.628759 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.628685 2579 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 11:15:37.628759 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.628662 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 11:15:37.628865 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.628807 2579 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 11:15:37.628865 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.628815 2579 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 11:15:37.628966 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.628843 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-190.ec2.internal\" not found"
Apr 17 11:15:37.629048 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.629024 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 11:15:37.629048 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.629048 2579 factory.go:55] Registering systemd factory
Apr 17 11:15:37.629212 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.629059 2579 factory.go:223] Registration of the systemd container factory successfully
Apr 17 11:15:37.629346 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.629326 2579 factory.go:153] Registering CRI-O factory
Apr 17 11:15:37.629346 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.629347 2579 factory.go:223] Registration of the crio container factory successfully
Apr 17 11:15:37.629516 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.629370 2579 factory.go:103] Registering Raw factory
Apr 17 11:15:37.629516 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.629383 2579 manager.go:1196] Started watching for new ooms in manager
Apr 17 11:15:37.630405 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.630317 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 11:15:37.630676 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.630661 2579 manager.go:319] Starting recovery of all containers
Apr 17 11:15:37.634714 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.634676 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 17 11:15:37.634890 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.634862 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-190.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 17 11:15:37.636171 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.634735 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-190.ec2.internal.18a720b47f1bb3b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-190.ec2.internal,UID:ip-10-0-133-190.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-190.ec2.internal,},FirstTimestamp:2026-04-17 11:15:37.619674038 +0000 UTC m=+0.486198179,LastTimestamp:2026-04-17 11:15:37.619674038 +0000 UTC m=+0.486198179,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-190.ec2.internal,}"
Apr 17 11:15:37.640409 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.640278 2579 manager.go:324] Recovery completed
Apr 17 11:15:37.641587 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.641571 2579 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 17 11:15:37.642302 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.642283 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dmjm9"
Apr 17 11:15:37.644669 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.644657 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:15:37.647012 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.646996 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:15:37.647081 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.647027 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:15:37.647081 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.647040 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:15:37.647521 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.647508 2579 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 11:15:37.647521 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.647518 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 11:15:37.647600 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.647552 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 11:15:37.649306 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.649245 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-190.ec2.internal.18a720b480bcd100 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-190.ec2.internal,UID:ip-10-0-133-190.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-133-190.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-133-190.ec2.internal,},FirstTimestamp:2026-04-17 11:15:37.647010048 +0000 UTC m=+0.513534188,LastTimestamp:2026-04-17 11:15:37.647010048 +0000 UTC m=+0.513534188,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-190.ec2.internal,}"
Apr 17 11:15:37.649898 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.649884 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dmjm9"
Apr 17 11:15:37.650745 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.650734 2579 policy_none.go:49] "None policy: Start"
Apr 17 11:15:37.650802 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.650749 2579 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 11:15:37.650802 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.650759 2579 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 11:15:37.686367 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.686350 2579 manager.go:341] "Starting Device Plugin manager"
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.686383 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.686394 2579 server.go:85] "Starting device plugin registration server"
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.686635 2579 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.686648 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.686756 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.686847 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.686856 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.687364 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.687408 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-190.ec2.internal\" not found"
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.697326 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.698496 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.698517 2579 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.698535 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.698543 2579 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.698581 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 11:15:37.710594 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.701111 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:15:37.787694 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.787665 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:15:37.789175 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.789158 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:15:37.789311 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.789187 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:15:37.789311 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.789200 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:15:37.789311 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.789231 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-190.ec2.internal"
Apr 17 11:15:37.796281 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.796263 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-190.ec2.internal"
Apr 17 11:15:37.796381 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.796287 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-190.ec2.internal\": node \"ip-10-0-133-190.ec2.internal\" not found"
Apr 17 11:15:37.799601 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.799566 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal"]
Apr 17 11:15:37.799707 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.799646 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:15:37.800755 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.800735 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:15:37.800851 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.800782 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:15:37.800851 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.800795 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:15:37.803011 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.802997 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:15:37.803159 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.803144 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal"
Apr 17 11:15:37.803201 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.803175 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:15:37.804104 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.804086 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:15:37.804174 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.804118 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:15:37.804174 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.804129 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:15:37.804238 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.804086 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:15:37.804238 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.804195 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:15:37.804238 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.804219 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:15:37.806461 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.806448 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal"
Apr 17 11:15:37.806511 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.806472 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:15:37.807415 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.807398 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:15:37.807525 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.807425 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:15:37.807525 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.807435 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:15:37.816367 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.816346 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-190.ec2.internal\" not found"
Apr 17 11:15:37.828594 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.828572 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-190.ec2.internal\" not found" node="ip-10-0-133-190.ec2.internal"
Apr 17 11:15:37.829609 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.829588 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4de89c435edead38b0b3647c59aadbd2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal\" (UID: \"4de89c435edead38b0b3647c59aadbd2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal"
Apr 17 11:15:37.829723 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.829623 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1a8754b0f8554418b50db3ceda052dae-config\") pod \"kube-apiserver-proxy-ip-10-0-133-190.ec2.internal\" (UID: \"1a8754b0f8554418b50db3ceda052dae\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal"
Apr 17 11:15:37.829723 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.829646 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4de89c435edead38b0b3647c59aadbd2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal\" (UID: \"4de89c435edead38b0b3647c59aadbd2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal"
Apr 17 11:15:37.833283 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.833267 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-190.ec2.internal\" not found" node="ip-10-0-133-190.ec2.internal"
Apr 17 11:15:37.916810 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:37.916782 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-190.ec2.internal\" not found"
Apr 17 11:15:37.930669 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.930643 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1a8754b0f8554418b50db3ceda052dae-config\") pod \"kube-apiserver-proxy-ip-10-0-133-190.ec2.internal\" (UID: \"1a8754b0f8554418b50db3ceda052dae\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal"
Apr 17 11:15:37.930732 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.930675 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4de89c435edead38b0b3647c59aadbd2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal\" (UID: \"4de89c435edead38b0b3647c59aadbd2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal"
Apr 17 11:15:37.930732 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.930690 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4de89c435edead38b0b3647c59aadbd2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal\" (UID: \"4de89c435edead38b0b3647c59aadbd2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal"
Apr 17 11:15:37.930827 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.930754 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4de89c435edead38b0b3647c59aadbd2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal\" (UID: \"4de89c435edead38b0b3647c59aadbd2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal"
Apr 17 11:15:37.930827 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.930757 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4de89c435edead38b0b3647c59aadbd2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal\" (UID: \"4de89c435edead38b0b3647c59aadbd2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal"
Apr 17 11:15:37.930827 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:37.930757 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1a8754b0f8554418b50db3ceda052dae-config\") pod \"kube-apiserver-proxy-ip-10-0-133-190.ec2.internal\" (UID: \"1a8754b0f8554418b50db3ceda052dae\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal"
Apr 17 11:15:38.017240 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:38.017152 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-190.ec2.internal\" not found"
Apr 17 11:15:38.118030 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:38.117992 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-190.ec2.internal\" not found"
Apr 17 11:15:38.130312 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.130285 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal"
Apr 17 11:15:38.135046 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.135028 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal"
Apr 17 11:15:38.218605 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:38.218569 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-190.ec2.internal\" not found"
Apr 17 11:15:38.319153 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:38.319086 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-190.ec2.internal\" not found"
Apr 17 11:15:38.419659 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:38.419630 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-190.ec2.internal\" not found"
Apr 17 11:15:38.520401 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:38.520371 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-190.ec2.internal\" not found"
Apr 17 11:15:38.531733 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.531709 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 11:15:38.531910 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.531883 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 11:15:38.621441 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:38.621405 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-190.ec2.internal\" not found"
Apr 17 11:15:38.627237 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.627214 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 11:15:38.642716 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.642689 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:15:38.651463 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.651435 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 11:10:37 +0000 UTC" deadline="2027-12-26 19:55:16.394435168 +0000 UTC"
Apr 17 11:15:38.651463 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.651459 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14840h39m37.742978537s"
Apr 17 11:15:38.669224 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.669194 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-lwpzm"
Apr 17 11:15:38.676924 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.676907 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-lwpzm"
Apr 17 11:15:38.698515 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.698497 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:15:38.706744 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:38.706709 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4de89c435edead38b0b3647c59aadbd2.slice/crio-8c23cd6ea5647f84693305cb024b203bb7414d0efc14792f5a0c7e65c9272d70 WatchSource:0}: Error finding container 8c23cd6ea5647f84693305cb024b203bb7414d0efc14792f5a0c7e65c9272d70: Status 404 returned error can't find the container with id 8c23cd6ea5647f84693305cb024b203bb7414d0efc14792f5a0c7e65c9272d70
Apr 17 11:15:38.707214 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:38.707182 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a8754b0f8554418b50db3ceda052dae.slice/crio-7bb6e34c1515ce8ce551254076b5ba6558e9f2b576109dac877ad401533da10c WatchSource:0}: Error finding container 7bb6e34c1515ce8ce551254076b5ba6558e9f2b576109dac877ad401533da10c: Status 404 returned error can't find the container with id 7bb6e34c1515ce8ce551254076b5ba6558e9f2b576109dac877ad401533da10c
Apr 17 11:15:38.711901 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.711883 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 11:15:38.728369 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.728343 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal"
Apr 17 11:15:38.735956 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.735935 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 11:15:38.737457
ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.737444 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" Apr 17 11:15:38.748072 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.748054 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 11:15:38.838615 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.838552 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:15:38.857458 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:38.857437 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:15:39.516412 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.516378 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:15:39.608889 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.608860 2579 apiserver.go:52] "Watching apiserver" Apr 17 11:15:39.617234 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.617208 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 11:15:39.619151 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.619126 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-node-tuning-operator/tuned-zc9p4","openshift-multus/multus-86dh5","openshift-multus/network-metrics-daemon-9g7pq","openshift-network-operator/iptables-alerter-t9jv4","kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal","openshift-dns/node-resolver-jrxcq","openshift-image-registry/node-ca-2678k","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal","openshift-multus/multus-additional-cni-plugins-dvnsm","openshift-network-diagnostics/network-check-target-47nt5","openshift-ovn-kubernetes/ovnkube-node-lxcn4","kube-system/konnectivity-agent-46jrw","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5"] Apr 17 11:15:39.621886 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.621860 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:39.621991 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:39.621959 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:15:39.624261 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.624231 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.626059 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.626032 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 11:15:39.626601 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.626365 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 11:15:39.626601 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.626377 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 11:15:39.626601 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.626423 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 11:15:39.626601 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.626555 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-sv9hn\"" Apr 17 11:15:39.628605 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.628582 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:39.628702 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:39.628656 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:15:39.628702 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.628677 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-t9jv4" Apr 17 11:15:39.630423 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.630399 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:15:39.630511 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.630461 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 11:15:39.630511 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.630499 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 11:15:39.630589 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.630405 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7pfnk\"" Apr 17 11:15:39.631034 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.631016 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jrxcq" Apr 17 11:15:39.632551 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.632532 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 11:15:39.632792 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.632757 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 11:15:39.632881 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.632815 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-96w8j\"" Apr 17 11:15:39.633270 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.633254 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.634835 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.634816 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 11:15:39.634991 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.634971 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 11:15:39.635095 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.635023 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-qwnw4\"" Apr 17 11:15:39.635460 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.635443 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2678k" Apr 17 11:15:39.636985 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.636965 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 11:15:39.637184 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.637161 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-gq95b\"" Apr 17 11:15:39.637268 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.637213 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 11:15:39.637328 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.637213 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 11:15:39.637823 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.637804 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.638688 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.638672 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-var-lib-cni-bin\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.638824 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.638700 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-run-netns\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.638824 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.638728 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-cnibin\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.638824 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.638793 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/389cf577-bd02-4903-96d9-cdc3fd99d418-cni-binary-copy\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.638982 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.638829 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-multus-conf-dir\") pod 
\"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.638982 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.638857 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/389cf577-bd02-4903-96d9-cdc3fd99d418-multus-daemon-config\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.638982 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.638883 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7920eeb8-72c9-4fe5-aff5-30f78ed7f840-tmp-dir\") pod \"node-resolver-jrxcq\" (UID: \"7920eeb8-72c9-4fe5-aff5-30f78ed7f840\") " pod="openshift-dns/node-resolver-jrxcq" Apr 17 11:15:39.638982 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.638908 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbkfv\" (UniqueName: \"kubernetes.io/projected/7920eeb8-72c9-4fe5-aff5-30f78ed7f840-kube-api-access-dbkfv\") pod \"node-resolver-jrxcq\" (UID: \"7920eeb8-72c9-4fe5-aff5-30f78ed7f840\") " pod="openshift-dns/node-resolver-jrxcq" Apr 17 11:15:39.638982 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.638972 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfdz4\" (UniqueName: \"kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4\") pod \"network-check-target-47nt5\" (UID: \"bfa20876-9d47-42bf-aad5-24503e05b86e\") " pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:39.639206 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.638996 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5q65g\" (UniqueName: \"kubernetes.io/projected/ec49f65a-cac1-4bb8-8dd5-f77b34ef2282-kube-api-access-5q65g\") pod \"iptables-alerter-t9jv4\" (UID: \"ec49f65a-cac1-4bb8-8dd5-f77b34ef2282\") " pod="openshift-network-operator/iptables-alerter-t9jv4" Apr 17 11:15:39.639206 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639022 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-system-cni-dir\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.639206 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639048 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-multus-socket-dir-parent\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.639206 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639073 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45fjw\" (UniqueName: \"kubernetes.io/projected/0ba74b24-e523-481e-82b5-080dc7ecb2e2-kube-api-access-45fjw\") pod \"network-metrics-daemon-9g7pq\" (UID: \"0ba74b24-e523-481e-82b5-080dc7ecb2e2\") " pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:39.639206 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639121 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec49f65a-cac1-4bb8-8dd5-f77b34ef2282-host-slash\") pod \"iptables-alerter-t9jv4\" (UID: \"ec49f65a-cac1-4bb8-8dd5-f77b34ef2282\") " pod="openshift-network-operator/iptables-alerter-t9jv4" Apr 17 11:15:39.639206 ip-10-0-133-190 
kubenswrapper[2579]: I0417 11:15:39.639150 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-multus-cni-dir\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.639206 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639173 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-os-release\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.639648 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639220 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-var-lib-cni-multus\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.639648 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639261 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-var-lib-kubelet\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.639648 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639287 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-run-multus-certs\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " 
pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.639648 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639305 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:15:39.639648 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639329 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-run-k8s-cni-cncf-io\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.639648 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639392 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-hostroot\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.639648 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639437 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dw25\" (UniqueName: \"kubernetes.io/projected/389cf577-bd02-4903-96d9-cdc3fd99d418-kube-api-access-2dw25\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.639648 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639470 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ec49f65a-cac1-4bb8-8dd5-f77b34ef2282-iptables-alerter-script\") pod \"iptables-alerter-t9jv4\" (UID: \"ec49f65a-cac1-4bb8-8dd5-f77b34ef2282\") " pod="openshift-network-operator/iptables-alerter-t9jv4" Apr 17 11:15:39.639648 ip-10-0-133-190 
kubenswrapper[2579]: I0417 11:15:39.639512 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7920eeb8-72c9-4fe5-aff5-30f78ed7f840-hosts-file\") pod \"node-resolver-jrxcq\" (UID: \"7920eeb8-72c9-4fe5-aff5-30f78ed7f840\") " pod="openshift-dns/node-resolver-jrxcq" Apr 17 11:15:39.639648 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639565 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kkrpp\"" Apr 17 11:15:39.639648 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639576 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-etc-kubernetes\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.639648 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639609 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs\") pod \"network-metrics-daemon-9g7pq\" (UID: \"0ba74b24-e523-481e-82b5-080dc7ecb2e2\") " pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:39.640187 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.639708 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 11:15:39.640187 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.640154 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.641667 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.641648 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 11:15:39.641822 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.641743 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 11:15:39.641822 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.641652 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 11:15:39.642173 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.642155 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 11:15:39.642270 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.642256 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 11:15:39.642332 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.642283 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 11:15:39.642447 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.642432 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rf766\"" Apr 17 11:15:39.644698 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.644673 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-46jrw" Apr 17 11:15:39.646861 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.646828 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 11:15:39.647092 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.646860 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 11:15:39.647092 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.647066 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-x47ml\"" Apr 17 11:15:39.647262 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.647236 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.648927 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.648907 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 11:15:39.648927 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.648918 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 11:15:39.649219 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.649203 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 11:15:39.649285 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.649229 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qhhfc\"" Apr 17 11:15:39.677899 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.677869 2579 certificate_manager.go:715] "Certificate rotation deadline determined" 
logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:10:38 +0000 UTC" deadline="2027-11-07 04:55:47.471627182 +0000 UTC"
Apr 17 11:15:39.677899 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.677896 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13649h40m7.793733097s"
Apr 17 11:15:39.702845 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.702797 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" event={"ID":"4de89c435edead38b0b3647c59aadbd2","Type":"ContainerStarted","Data":"8c23cd6ea5647f84693305cb024b203bb7414d0efc14792f5a0c7e65c9272d70"}
Apr 17 11:15:39.703922 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.703888 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal" event={"ID":"1a8754b0f8554418b50db3ceda052dae","Type":"ContainerStarted","Data":"7bb6e34c1515ce8ce551254076b5ba6558e9f2b576109dac877ad401533da10c"}
Apr 17 11:15:39.729493 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.729458 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 11:15:39.739802 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.739755 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-sys-fs\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5"
Apr 17 11:15:39.739960 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.739841 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-cnibin\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.739960 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.739887 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/389cf577-bd02-4903-96d9-cdc3fd99d418-cni-binary-copy\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.739960 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.739911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbkfv\" (UniqueName: \"kubernetes.io/projected/7920eeb8-72c9-4fe5-aff5-30f78ed7f840-kube-api-access-dbkfv\") pod \"node-resolver-jrxcq\" (UID: \"7920eeb8-72c9-4fe5-aff5-30f78ed7f840\") " pod="openshift-dns/node-resolver-jrxcq"
Apr 17 11:15:39.739960 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.739915 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-cnibin\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.739960 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.739937 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/841bd702-cde2-4bf5-9789-aa664c501f8f-os-release\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm"
Apr 17 11:15:39.740219 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.739964 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/841bd702-cde2-4bf5-9789-aa664c501f8f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm"
Apr 17 11:15:39.740219 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.739989 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bed77416-eb94-411f-885d-cc01490e88a0-etc-tuned\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4"
Apr 17 11:15:39.740219 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740011 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-sysconfig\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4"
Apr 17 11:15:39.740219 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740038 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdz4\" (UniqueName: \"kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4\") pod \"network-check-target-47nt5\" (UID: \"bfa20876-9d47-42bf-aad5-24503e05b86e\") " pod="openshift-network-diagnostics/network-check-target-47nt5"
Apr 17 11:15:39.740219 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740065 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q65g\" (UniqueName: \"kubernetes.io/projected/ec49f65a-cac1-4bb8-8dd5-f77b34ef2282-kube-api-access-5q65g\") pod \"iptables-alerter-t9jv4\" (UID: \"ec49f65a-cac1-4bb8-8dd5-f77b34ef2282\") " pod="openshift-network-operator/iptables-alerter-t9jv4"
Apr 17 11:15:39.740219 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740095 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/841bd702-cde2-4bf5-9789-aa664c501f8f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm"
Apr 17 11:15:39.740219 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740121 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/841bd702-cde2-4bf5-9789-aa664c501f8f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm"
Apr 17 11:15:39.740219 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740148 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdzqd\" (UniqueName: \"kubernetes.io/projected/0a2631b2-add8-43b1-a9b4-b872018c7373-kube-api-access-vdzqd\") pod \"node-ca-2678k\" (UID: \"0a2631b2-add8-43b1-a9b4-b872018c7373\") " pod="openshift-image-registry/node-ca-2678k"
Apr 17 11:15:39.740219 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740170 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-sys\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4"
Apr 17 11:15:39.740219 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740193 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-sysctl-d\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4"
Apr 17 11:15:39.740219 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740218 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-var-lib-kubelet\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4"
Apr 17 11:15:39.740710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740240 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0a2631b2-add8-43b1-a9b4-b872018c7373-serviceca\") pod \"node-ca-2678k\" (UID: \"0a2631b2-add8-43b1-a9b4-b872018c7373\") " pod="openshift-image-registry/node-ca-2678k"
Apr 17 11:15:39.740710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740265 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-slash\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.740710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740290 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j96dq\" (UniqueName: \"kubernetes.io/projected/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-kube-api-access-j96dq\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.740710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740317 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-registration-dir\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5"
Apr 17 11:15:39.740710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740342 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq8rs\" (UniqueName: \"kubernetes.io/projected/feeaed08-a6f0-498e-827c-56a07f3c55d7-kube-api-access-qq8rs\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5"
Apr 17 11:15:39.740710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740368 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-multus-cni-dir\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.740710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740413 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-kubernetes\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4"
Apr 17 11:15:39.740710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740453 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5wnb\" (UniqueName: \"kubernetes.io/projected/bed77416-eb94-411f-885d-cc01490e88a0-kube-api-access-r5wnb\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4"
Apr 17 11:15:39.740710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740478 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-run-netns\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.740710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740508 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-run-ovn\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.740710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740519 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/389cf577-bd02-4903-96d9-cdc3fd99d418-cni-binary-copy\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.740710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740533 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.740710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740575 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-env-overrides\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.740710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740621 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5652250a-f654-49f4-a3fa-82e77fa0b777-konnectivity-ca\") pod \"konnectivity-agent-46jrw\" (UID: \"5652250a-f654-49f4-a3fa-82e77fa0b777\") " pod="kube-system/konnectivity-agent-46jrw"
Apr 17 11:15:39.740710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740702 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-hostroot\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740728 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-multus-cni-dir\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740730 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7920eeb8-72c9-4fe5-aff5-30f78ed7f840-hosts-file\") pod \"node-resolver-jrxcq\" (UID: \"7920eeb8-72c9-4fe5-aff5-30f78ed7f840\") " pod="openshift-dns/node-resolver-jrxcq"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740788 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/841bd702-cde2-4bf5-9789-aa664c501f8f-system-cni-dir\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740793 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7920eeb8-72c9-4fe5-aff5-30f78ed7f840-hosts-file\") pod \"node-resolver-jrxcq\" (UID: \"7920eeb8-72c9-4fe5-aff5-30f78ed7f840\") " pod="openshift-dns/node-resolver-jrxcq"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740816 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-node-log\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740840 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-hostroot\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740846 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740875 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/841bd702-cde2-4bf5-9789-aa664c501f8f-cni-binary-copy\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740900 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-cni-bin\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740937 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-run\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.740970 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-host\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741001 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-run-systemd\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741027 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-run-netns\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741059 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-systemd-units\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741090 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-var-lib-openvswitch\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741132 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-run-netns\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.741336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741140 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-ovnkube-script-lib\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741168 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-multus-conf-dir\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741200 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/389cf577-bd02-4903-96d9-cdc3fd99d418-multus-daemon-config\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741226 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7920eeb8-72c9-4fe5-aff5-30f78ed7f840-tmp-dir\") pod \"node-resolver-jrxcq\" (UID: \"7920eeb8-72c9-4fe5-aff5-30f78ed7f840\") " pod="openshift-dns/node-resolver-jrxcq"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741262 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7dpw\" (UniqueName: \"kubernetes.io/projected/841bd702-cde2-4bf5-9789-aa664c501f8f-kube-api-access-p7dpw\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741299 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-multus-conf-dir\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741399 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-sysctl-conf\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741432 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-log-socket\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741460 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5652250a-f654-49f4-a3fa-82e77fa0b777-agent-certs\") pod \"konnectivity-agent-46jrw\" (UID: \"5652250a-f654-49f4-a3fa-82e77fa0b777\") " pod="kube-system/konnectivity-agent-46jrw"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741485 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/841bd702-cde2-4bf5-9789-aa664c501f8f-cnibin\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741508 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-kubelet\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741532 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-ovnkube-config\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741557 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-socket-dir\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741593 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-system-cni-dir\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741611 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7920eeb8-72c9-4fe5-aff5-30f78ed7f840-tmp-dir\") pod \"node-resolver-jrxcq\" (UID: \"7920eeb8-72c9-4fe5-aff5-30f78ed7f840\") " pod="openshift-dns/node-resolver-jrxcq"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741648 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-multus-socket-dir-parent\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741658 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-system-cni-dir\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741686 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45fjw\" (UniqueName: \"kubernetes.io/projected/0ba74b24-e523-481e-82b5-080dc7ecb2e2-kube-api-access-45fjw\") pod \"network-metrics-daemon-9g7pq\" (UID: \"0ba74b24-e523-481e-82b5-080dc7ecb2e2\") " pod="openshift-multus/network-metrics-daemon-9g7pq"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741712 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec49f65a-cac1-4bb8-8dd5-f77b34ef2282-host-slash\") pod \"iptables-alerter-t9jv4\" (UID: \"ec49f65a-cac1-4bb8-8dd5-f77b34ef2282\") " pod="openshift-network-operator/iptables-alerter-t9jv4"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741718 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/389cf577-bd02-4903-96d9-cdc3fd99d418-multus-daemon-config\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741720 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-multus-socket-dir-parent\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741754 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec49f65a-cac1-4bb8-8dd5-f77b34ef2282-host-slash\") pod \"iptables-alerter-t9jv4\" (UID: \"ec49f65a-cac1-4bb8-8dd5-f77b34ef2282\") " pod="openshift-network-operator/iptables-alerter-t9jv4"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741876 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-etc-openvswitch\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741927 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-device-dir\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.741959 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-os-release\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742019 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-var-lib-cni-multus\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742029 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-os-release\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742049 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-var-lib-cni-multus\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742069 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-var-lib-kubelet\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742095 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-run-multus-certs\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742121 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a2631b2-add8-43b1-a9b4-b872018c7373-host\") pod \"node-ca-2678k\" (UID: \"0a2631b2-add8-43b1-a9b4-b872018c7373\") " pod="openshift-image-registry/node-ca-2678k"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742194 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-var-lib-kubelet\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742236 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-run-k8s-cni-cncf-io\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742268 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-run-multus-certs\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742276 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dw25\" (UniqueName: \"kubernetes.io/projected/389cf577-bd02-4903-96d9-cdc3fd99d418-kube-api-access-2dw25\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.742725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742295 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-run-k8s-cni-cncf-io\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742307 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ec49f65a-cac1-4bb8-8dd5-f77b34ef2282-iptables-alerter-script\") pod \"iptables-alerter-t9jv4\" (UID: \"ec49f65a-cac1-4bb8-8dd5-f77b34ef2282\") " pod="openshift-network-operator/iptables-alerter-t9jv4"
Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742359 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-systemd\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4"
Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742383 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-etc-selinux\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5"
Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742412 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-etc-kubernetes\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742437 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs\") pod \"network-metrics-daemon-9g7pq\" (UID: \"0ba74b24-e523-481e-82b5-080dc7ecb2e2\") " pod="openshift-multus/network-metrics-daemon-9g7pq"
Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742491 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-etc-kubernetes\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:39.742527 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742536 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-modprobe-d\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4"
Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742568 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-run-openvswitch\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:39.742615 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs podName:0ba74b24-e523-481e-82b5-080dc7ecb2e2 nodeName:}" failed. No retries permitted until 2026-04-17 11:15:40.242582795 +0000 UTC m=+3.109106941 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs") pod "network-metrics-daemon-9g7pq" (UID: "0ba74b24-e523-481e-82b5-080dc7ecb2e2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742638 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-ovn-node-metrics-cert\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4"
Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742670 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-var-lib-cni-bin\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5"
Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742697 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-lib-modules\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4"
Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742721 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bed77416-eb94-411f-885d-cc01490e88a0-tmp\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4"
Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]:
I0417 11:15:39.742757 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ec49f65a-cac1-4bb8-8dd5-f77b34ef2282-iptables-alerter-script\") pod \"iptables-alerter-t9jv4\" (UID: \"ec49f65a-cac1-4bb8-8dd5-f77b34ef2282\") " pod="openshift-network-operator/iptables-alerter-t9jv4" Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742761 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/389cf577-bd02-4903-96d9-cdc3fd99d418-host-var-lib-cni-bin\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.743345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742782 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-cni-netd\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.743926 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.742818 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.745628 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.745599 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 11:15:39.746018 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:39.745984 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:15:39.746018 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:39.746008 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:15:39.746018 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:39.746021 2579 projected.go:194] Error preparing data for projected volume kube-api-access-lfdz4 for pod openshift-network-diagnostics/network-check-target-47nt5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:15:39.746230 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:39.746088 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4 podName:bfa20876-9d47-42bf-aad5-24503e05b86e nodeName:}" failed. No retries permitted until 2026-04-17 11:15:40.246072665 +0000 UTC m=+3.112596797 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-lfdz4" (UniqueName: "kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4") pod "network-check-target-47nt5" (UID: "bfa20876-9d47-42bf-aad5-24503e05b86e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:15:39.749211 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.749189 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q65g\" (UniqueName: \"kubernetes.io/projected/ec49f65a-cac1-4bb8-8dd5-f77b34ef2282-kube-api-access-5q65g\") pod \"iptables-alerter-t9jv4\" (UID: \"ec49f65a-cac1-4bb8-8dd5-f77b34ef2282\") " pod="openshift-network-operator/iptables-alerter-t9jv4" Apr 17 11:15:39.749315 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.749190 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45fjw\" (UniqueName: \"kubernetes.io/projected/0ba74b24-e523-481e-82b5-080dc7ecb2e2-kube-api-access-45fjw\") pod \"network-metrics-daemon-9g7pq\" (UID: \"0ba74b24-e523-481e-82b5-080dc7ecb2e2\") " pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:39.749600 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.749577 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dw25\" (UniqueName: \"kubernetes.io/projected/389cf577-bd02-4903-96d9-cdc3fd99d418-kube-api-access-2dw25\") pod \"multus-86dh5\" (UID: \"389cf577-bd02-4903-96d9-cdc3fd99d418\") " pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.749703 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.749586 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbkfv\" (UniqueName: \"kubernetes.io/projected/7920eeb8-72c9-4fe5-aff5-30f78ed7f840-kube-api-access-dbkfv\") pod \"node-resolver-jrxcq\" (UID: \"7920eeb8-72c9-4fe5-aff5-30f78ed7f840\") " 
pod="openshift-dns/node-resolver-jrxcq" Apr 17 11:15:39.843719 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.843648 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-run\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.843719 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.843683 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-host\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.843719 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.843709 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-run-systemd\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.843925 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.843742 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-systemd-units\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.843925 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.843781 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-var-lib-openvswitch\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.843925 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.843783 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-run\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.843925 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.843786 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-host\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.843925 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.843807 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-ovnkube-script-lib\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.843925 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.843837 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-systemd-units\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.843925 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.843841 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-run-systemd\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 
11:15:39.843925 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.843881 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7dpw\" (UniqueName: \"kubernetes.io/projected/841bd702-cde2-4bf5-9789-aa664c501f8f-kube-api-access-p7dpw\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.843925 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.843887 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-var-lib-openvswitch\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.844225 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.843934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-sysctl-conf\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.844225 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.843962 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-log-socket\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.844225 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.843987 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5652250a-f654-49f4-a3fa-82e77fa0b777-agent-certs\") pod \"konnectivity-agent-46jrw\" (UID: \"5652250a-f654-49f4-a3fa-82e77fa0b777\") " 
pod="kube-system/konnectivity-agent-46jrw" Apr 17 11:15:39.844225 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844008 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/841bd702-cde2-4bf5-9789-aa664c501f8f-cnibin\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.844225 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844033 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-kubelet\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.844225 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844034 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-sysctl-conf\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.844225 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844060 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-log-socket\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.844225 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844096 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/841bd702-cde2-4bf5-9789-aa664c501f8f-cnibin\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " 
pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.844225 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844117 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-ovnkube-config\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.844225 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844199 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-socket-dir\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844229 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-etc-openvswitch\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844268 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-device-dir\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844308 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-kubelet\") pod 
\"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a2631b2-add8-43b1-a9b4-b872018c7373-host\") pod \"node-ca-2678k\" (UID: \"0a2631b2-add8-43b1-a9b4-b872018c7373\") " pod="openshift-image-registry/node-ca-2678k" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844349 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a2631b2-add8-43b1-a9b4-b872018c7373-host\") pod \"node-ca-2678k\" (UID: \"0a2631b2-add8-43b1-a9b4-b872018c7373\") " pod="openshift-image-registry/node-ca-2678k" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844364 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-socket-dir\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844372 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-systemd\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844388 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-etc-openvswitch\") pod \"ovnkube-node-lxcn4\" (UID: 
\"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844429 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-systemd\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844428 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-etc-selinux\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844443 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-ovnkube-script-lib\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844465 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-device-dir\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844487 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-modprobe-d\") 
pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844521 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-etc-selinux\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844522 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-run-openvswitch\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844566 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-run-openvswitch\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.844764 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844605 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-ovn-node-metrics-cert\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.845374 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844611 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-modprobe-d\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.845374 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844625 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-ovnkube-config\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.845374 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844632 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-lib-modules\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.845374 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bed77416-eb94-411f-885d-cc01490e88a0-tmp\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.845374 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844678 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-cni-netd\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.845374 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844701 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.845374 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844752 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-lib-modules\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.845374 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.844816 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-cni-netd\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.845713 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.845578 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.845845 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.845816 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-sys-fs\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.845982 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.845881 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/841bd702-cde2-4bf5-9789-aa664c501f8f-os-release\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.845982 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.845911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/841bd702-cde2-4bf5-9789-aa664c501f8f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.845982 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.845953 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bed77416-eb94-411f-885d-cc01490e88a0-etc-tuned\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.845982 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.845983 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-sysconfig\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.846187 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.846027 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/841bd702-cde2-4bf5-9789-aa664c501f8f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.846187 ip-10-0-133-190 kubenswrapper[2579]: I0417 
11:15:39.846061 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/841bd702-cde2-4bf5-9789-aa664c501f8f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.846187 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.846092 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdzqd\" (UniqueName: \"kubernetes.io/projected/0a2631b2-add8-43b1-a9b4-b872018c7373-kube-api-access-vdzqd\") pod \"node-ca-2678k\" (UID: \"0a2631b2-add8-43b1-a9b4-b872018c7373\") " pod="openshift-image-registry/node-ca-2678k" Apr 17 11:15:39.846460 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.846440 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/841bd702-cde2-4bf5-9789-aa664c501f8f-os-release\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.846663 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.846642 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-sys\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.846778 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.846748 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-sys-fs\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.846855 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.846827 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-sysconfig\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.847311 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847276 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5652250a-f654-49f4-a3fa-82e77fa0b777-agent-certs\") pod \"konnectivity-agent-46jrw\" (UID: \"5652250a-f654-49f4-a3fa-82e77fa0b777\") " pod="kube-system/konnectivity-agent-46jrw" Apr 17 11:15:39.847391 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847326 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/841bd702-cde2-4bf5-9789-aa664c501f8f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847454 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/841bd702-cde2-4bf5-9789-aa664c501f8f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847527 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-sysctl-d\") pod \"tuned-zc9p4\" (UID: 
\"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847560 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-var-lib-kubelet\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847627 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0a2631b2-add8-43b1-a9b4-b872018c7373-serviceca\") pod \"node-ca-2678k\" (UID: \"0a2631b2-add8-43b1-a9b4-b872018c7373\") " pod="openshift-image-registry/node-ca-2678k" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847658 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-slash\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847685 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j96dq\" (UniqueName: \"kubernetes.io/projected/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-kube-api-access-j96dq\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847728 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-registration-dir\") pod 
\"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847732 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bed77416-eb94-411f-885d-cc01490e88a0-tmp\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847804 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/841bd702-cde2-4bf5-9789-aa664c501f8f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847842 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qq8rs\" (UniqueName: \"kubernetes.io/projected/feeaed08-a6f0-498e-827c-56a07f3c55d7-kube-api-access-qq8rs\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847840 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-sys\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847887 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-kubernetes\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847905 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/feeaed08-a6f0-498e-827c-56a07f3c55d7-registration-dir\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847924 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5wnb\" (UniqueName: \"kubernetes.io/projected/bed77416-eb94-411f-885d-cc01490e88a0-kube-api-access-r5wnb\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847927 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-sysctl-d\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.847955 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-var-lib-kubelet\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.848024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.848016 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-run-netns\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.848806 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.848049 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-run-ovn\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.848806 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.848075 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.848806 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.848083 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bed77416-eb94-411f-885d-cc01490e88a0-etc-kubernetes\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.848806 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.848106 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-env-overrides\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.848806 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.848443 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5652250a-f654-49f4-a3fa-82e77fa0b777-konnectivity-ca\") pod \"konnectivity-agent-46jrw\" (UID: \"5652250a-f654-49f4-a3fa-82e77fa0b777\") " pod="kube-system/konnectivity-agent-46jrw" Apr 17 11:15:39.848806 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.848483 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/841bd702-cde2-4bf5-9789-aa664c501f8f-system-cni-dir\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.848806 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.848486 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-run-ovn\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.848806 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.848514 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-node-log\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.848806 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.848539 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.848806 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.848570 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/841bd702-cde2-4bf5-9789-aa664c501f8f-cni-binary-copy\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.849220 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.848868 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-cni-bin\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.849220 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.848915 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-env-overrides\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.849220 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.848949 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-cni-bin\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.849220 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.849193 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-run-netns\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.849469 ip-10-0-133-190 kubenswrapper[2579]: I0417 
11:15:39.849434 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-slash\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.850944 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.849609 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/841bd702-cde2-4bf5-9789-aa664c501f8f-system-cni-dir\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.850944 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.849737 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0a2631b2-add8-43b1-a9b4-b872018c7373-serviceca\") pod \"node-ca-2678k\" (UID: \"0a2631b2-add8-43b1-a9b4-b872018c7373\") " pod="openshift-image-registry/node-ca-2678k" Apr 17 11:15:39.850944 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.849829 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.850944 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.849886 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-node-log\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.850944 ip-10-0-133-190 
kubenswrapper[2579]: I0417 11:15:39.849917 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.850944 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.850005 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bed77416-eb94-411f-885d-cc01490e88a0-etc-tuned\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.850944 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.850157 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5652250a-f654-49f4-a3fa-82e77fa0b777-konnectivity-ca\") pod \"konnectivity-agent-46jrw\" (UID: \"5652250a-f654-49f4-a3fa-82e77fa0b777\") " pod="kube-system/konnectivity-agent-46jrw" Apr 17 11:15:39.850944 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.850867 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/841bd702-cde2-4bf5-9789-aa664c501f8f-cni-binary-copy\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.851827 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.851803 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-ovn-node-metrics-cert\") pod \"ovnkube-node-lxcn4\" (UID: \"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 
11:15:39.852484 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.852466 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7dpw\" (UniqueName: \"kubernetes.io/projected/841bd702-cde2-4bf5-9789-aa664c501f8f-kube-api-access-p7dpw\") pod \"multus-additional-cni-plugins-dvnsm\" (UID: \"841bd702-cde2-4bf5-9789-aa664c501f8f\") " pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.853745 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.853726 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdzqd\" (UniqueName: \"kubernetes.io/projected/0a2631b2-add8-43b1-a9b4-b872018c7373-kube-api-access-vdzqd\") pod \"node-ca-2678k\" (UID: \"0a2631b2-add8-43b1-a9b4-b872018c7373\") " pod="openshift-image-registry/node-ca-2678k" Apr 17 11:15:39.855580 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.855559 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5wnb\" (UniqueName: \"kubernetes.io/projected/bed77416-eb94-411f-885d-cc01490e88a0-kube-api-access-r5wnb\") pod \"tuned-zc9p4\" (UID: \"bed77416-eb94-411f-885d-cc01490e88a0\") " pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.855869 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.855853 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq8rs\" (UniqueName: \"kubernetes.io/projected/feeaed08-a6f0-498e-827c-56a07f3c55d7-kube-api-access-qq8rs\") pod \"aws-ebs-csi-driver-node-mrrv5\" (UID: \"feeaed08-a6f0-498e-827c-56a07f3c55d7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:39.856129 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.856113 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j96dq\" (UniqueName: \"kubernetes.io/projected/431e03f9-9af4-4fa7-8f47-c50f52e2a7e5-kube-api-access-j96dq\") pod \"ovnkube-node-lxcn4\" (UID: 
\"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.935201 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.935159 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-86dh5" Apr 17 11:15:39.943988 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.943965 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t9jv4" Apr 17 11:15:39.951721 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.951700 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jrxcq" Apr 17 11:15:39.957280 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.957262 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dvnsm" Apr 17 11:15:39.966838 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.966821 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2678k" Apr 17 11:15:39.972399 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.972379 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" Apr 17 11:15:39.980013 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.979994 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:15:39.986581 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.986556 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-46jrw" Apr 17 11:15:39.992261 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:39.992238 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" Apr 17 11:15:40.251246 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:40.251205 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdz4\" (UniqueName: \"kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4\") pod \"network-check-target-47nt5\" (UID: \"bfa20876-9d47-42bf-aad5-24503e05b86e\") " pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:40.251420 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:40.251307 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs\") pod \"network-metrics-daemon-9g7pq\" (UID: \"0ba74b24-e523-481e-82b5-080dc7ecb2e2\") " pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:40.251420 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:40.251382 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:15:40.251420 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:40.251401 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:15:40.251569 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:40.251457 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs podName:0ba74b24-e523-481e-82b5-080dc7ecb2e2 nodeName:}" failed. No retries permitted until 2026-04-17 11:15:41.25143836 +0000 UTC m=+4.117962494 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs") pod "network-metrics-daemon-9g7pq" (UID: "0ba74b24-e523-481e-82b5-080dc7ecb2e2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:15:40.251569 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:40.251403 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:15:40.251569 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:40.251514 2579 projected.go:194] Error preparing data for projected volume kube-api-access-lfdz4 for pod openshift-network-diagnostics/network-check-target-47nt5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:15:40.251717 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:40.251576 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4 podName:bfa20876-9d47-42bf-aad5-24503e05b86e nodeName:}" failed. No retries permitted until 2026-04-17 11:15:41.251558732 +0000 UTC m=+4.118082878 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-lfdz4" (UniqueName: "kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4") pod "network-check-target-47nt5" (UID: "bfa20876-9d47-42bf-aad5-24503e05b86e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:15:40.360195 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:40.360162 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5652250a_f654_49f4_a3fa_82e77fa0b777.slice/crio-38711b229bb04ec70d22e653effe7680cf9be0a6f82c4e8762abfe65b1ad7491 WatchSource:0}: Error finding container 38711b229bb04ec70d22e653effe7680cf9be0a6f82c4e8762abfe65b1ad7491: Status 404 returned error can't find the container with id 38711b229bb04ec70d22e653effe7680cf9be0a6f82c4e8762abfe65b1ad7491 Apr 17 11:15:40.361232 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:40.361148 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod389cf577_bd02_4903_96d9_cdc3fd99d418.slice/crio-429c79db0e076cfaa8e436f7e725ac2e4707aa3871bf11b7d6ae5f49c63da28b WatchSource:0}: Error finding container 429c79db0e076cfaa8e436f7e725ac2e4707aa3871bf11b7d6ae5f49c63da28b: Status 404 returned error can't find the container with id 429c79db0e076cfaa8e436f7e725ac2e4707aa3871bf11b7d6ae5f49c63da28b Apr 17 11:15:40.363161 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:40.363034 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbed77416_eb94_411f_885d_cc01490e88a0.slice/crio-16a47cefbd90a55ebd2189b0602d8317198d6ba10d1c70a03d671d2ea838f62f WatchSource:0}: Error finding container 16a47cefbd90a55ebd2189b0602d8317198d6ba10d1c70a03d671d2ea838f62f: Status 404 returned error can't find the 
container with id 16a47cefbd90a55ebd2189b0602d8317198d6ba10d1c70a03d671d2ea838f62f Apr 17 11:15:40.367225 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:40.367201 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec49f65a_cac1_4bb8_8dd5_f77b34ef2282.slice/crio-9a48ebd4b5134dfc48ef01c172c8cdbf3945274da3cb20c22d053240cc3cf31b WatchSource:0}: Error finding container 9a48ebd4b5134dfc48ef01c172c8cdbf3945274da3cb20c22d053240cc3cf31b: Status 404 returned error can't find the container with id 9a48ebd4b5134dfc48ef01c172c8cdbf3945274da3cb20c22d053240cc3cf31b Apr 17 11:15:40.388502 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:40.388476 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod431e03f9_9af4_4fa7_8f47_c50f52e2a7e5.slice/crio-cc942ebcd9fb8ded738ee6de42dfe60102f14f77fdadd7c923d4bd17a1dfc1b7 WatchSource:0}: Error finding container cc942ebcd9fb8ded738ee6de42dfe60102f14f77fdadd7c923d4bd17a1dfc1b7: Status 404 returned error can't find the container with id cc942ebcd9fb8ded738ee6de42dfe60102f14f77fdadd7c923d4bd17a1dfc1b7 Apr 17 11:15:40.390312 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:40.390248 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a2631b2_add8_43b1_a9b4_b872018c7373.slice/crio-c00032419812070bc9592949fbe85baf43b84cdfe7dcd4838979ffb8762d0607 WatchSource:0}: Error finding container c00032419812070bc9592949fbe85baf43b84cdfe7dcd4838979ffb8762d0607: Status 404 returned error can't find the container with id c00032419812070bc9592949fbe85baf43b84cdfe7dcd4838979ffb8762d0607 Apr 17 11:15:40.391414 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:40.391375 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfeeaed08_a6f0_498e_827c_56a07f3c55d7.slice/crio-801b9337332f76fa7306b46d0e1186a70fa5b2a4641e282452bf0242233be902 WatchSource:0}: Error finding container 801b9337332f76fa7306b46d0e1186a70fa5b2a4641e282452bf0242233be902: Status 404 returned error can't find the container with id 801b9337332f76fa7306b46d0e1186a70fa5b2a4641e282452bf0242233be902 Apr 17 11:15:40.392180 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:40.392108 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod841bd702_cde2_4bf5_9789_aa664c501f8f.slice/crio-00c09c65a1606cda6153f911b0729dfad702c97ad3ddc7a1225304bf7e787515 WatchSource:0}: Error finding container 00c09c65a1606cda6153f911b0729dfad702c97ad3ddc7a1225304bf7e787515: Status 404 returned error can't find the container with id 00c09c65a1606cda6153f911b0729dfad702c97ad3ddc7a1225304bf7e787515 Apr 17 11:15:40.392394 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:15:40.392300 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7920eeb8_72c9_4fe5_aff5_30f78ed7f840.slice/crio-200fd97ef6a8cb7928a40899be1a8c8131e82f3c2fcaa64c72e09d27640e7841 WatchSource:0}: Error finding container 200fd97ef6a8cb7928a40899be1a8c8131e82f3c2fcaa64c72e09d27640e7841: Status 404 returned error can't find the container with id 200fd97ef6a8cb7928a40899be1a8c8131e82f3c2fcaa64c72e09d27640e7841 Apr 17 11:15:40.678644 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:40.678442 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:10:38 +0000 UTC" deadline="2028-01-07 18:47:35.202240822 +0000 UTC" Apr 17 11:15:40.678644 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:40.678639 2579 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="15127h31m54.523605656s" Apr 17 11:15:40.707656 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:40.707593 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2678k" event={"ID":"0a2631b2-add8-43b1-a9b4-b872018c7373","Type":"ContainerStarted","Data":"c00032419812070bc9592949fbe85baf43b84cdfe7dcd4838979ffb8762d0607"} Apr 17 11:15:40.710026 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:40.709995 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dvnsm" event={"ID":"841bd702-cde2-4bf5-9789-aa664c501f8f","Type":"ContainerStarted","Data":"00c09c65a1606cda6153f911b0729dfad702c97ad3ddc7a1225304bf7e787515"} Apr 17 11:15:40.711658 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:40.711630 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" event={"ID":"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5","Type":"ContainerStarted","Data":"cc942ebcd9fb8ded738ee6de42dfe60102f14f77fdadd7c923d4bd17a1dfc1b7"} Apr 17 11:15:40.714897 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:40.714871 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t9jv4" event={"ID":"ec49f65a-cac1-4bb8-8dd5-f77b34ef2282","Type":"ContainerStarted","Data":"9a48ebd4b5134dfc48ef01c172c8cdbf3945274da3cb20c22d053240cc3cf31b"} Apr 17 11:15:40.715960 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:40.715931 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" event={"ID":"bed77416-eb94-411f-885d-cc01490e88a0","Type":"ContainerStarted","Data":"16a47cefbd90a55ebd2189b0602d8317198d6ba10d1c70a03d671d2ea838f62f"} Apr 17 11:15:40.717345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:40.717320 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" 
event={"ID":"feeaed08-a6f0-498e-827c-56a07f3c55d7","Type":"ContainerStarted","Data":"801b9337332f76fa7306b46d0e1186a70fa5b2a4641e282452bf0242233be902"} Apr 17 11:15:40.718854 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:40.718508 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-86dh5" event={"ID":"389cf577-bd02-4903-96d9-cdc3fd99d418","Type":"ContainerStarted","Data":"429c79db0e076cfaa8e436f7e725ac2e4707aa3871bf11b7d6ae5f49c63da28b"} Apr 17 11:15:40.720026 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:40.720003 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-46jrw" event={"ID":"5652250a-f654-49f4-a3fa-82e77fa0b777","Type":"ContainerStarted","Data":"38711b229bb04ec70d22e653effe7680cf9be0a6f82c4e8762abfe65b1ad7491"} Apr 17 11:15:40.723856 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:40.723834 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal" event={"ID":"1a8754b0f8554418b50db3ceda052dae","Type":"ContainerStarted","Data":"07178df4cd52e6bb4eb0a917e913970751d9d1c71547bd2696c2e7728db98ae9"} Apr 17 11:15:40.725948 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:40.725919 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jrxcq" event={"ID":"7920eeb8-72c9-4fe5-aff5-30f78ed7f840","Type":"ContainerStarted","Data":"200fd97ef6a8cb7928a40899be1a8c8131e82f3c2fcaa64c72e09d27640e7841"} Apr 17 11:15:41.259892 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:41.259838 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdz4\" (UniqueName: \"kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4\") pod \"network-check-target-47nt5\" (UID: \"bfa20876-9d47-42bf-aad5-24503e05b86e\") " pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:41.260097 ip-10-0-133-190 kubenswrapper[2579]: I0417 
11:15:41.259928 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs\") pod \"network-metrics-daemon-9g7pq\" (UID: \"0ba74b24-e523-481e-82b5-080dc7ecb2e2\") " pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:41.260097 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:41.260041 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:15:41.260097 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:41.260068 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:15:41.260097 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:41.260082 2579 projected.go:194] Error preparing data for projected volume kube-api-access-lfdz4 for pod openshift-network-diagnostics/network-check-target-47nt5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:15:41.260097 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:41.260091 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:15:41.260342 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:41.260145 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4 podName:bfa20876-9d47-42bf-aad5-24503e05b86e nodeName:}" failed. No retries permitted until 2026-04-17 11:15:43.260126091 +0000 UTC m=+6.126650236 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-lfdz4" (UniqueName: "kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4") pod "network-check-target-47nt5" (UID: "bfa20876-9d47-42bf-aad5-24503e05b86e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:15:41.260342 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:41.260165 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs podName:0ba74b24-e523-481e-82b5-080dc7ecb2e2 nodeName:}" failed. No retries permitted until 2026-04-17 11:15:43.260155414 +0000 UTC m=+6.126679545 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs") pod "network-metrics-daemon-9g7pq" (UID: "0ba74b24-e523-481e-82b5-080dc7ecb2e2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:15:41.701147 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:41.701114 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:41.701619 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:41.701237 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:15:41.701684 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:41.701654 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:41.701801 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:41.701761 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:15:41.748703 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:41.747294 2579 generic.go:358] "Generic (PLEG): container finished" podID="4de89c435edead38b0b3647c59aadbd2" containerID="6abdaf348ac323845f218188a1fa966a11de2477d7c9168b11fed3e5b92d0b54" exitCode=0 Apr 17 11:15:41.748703 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:41.748457 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" event={"ID":"4de89c435edead38b0b3647c59aadbd2","Type":"ContainerDied","Data":"6abdaf348ac323845f218188a1fa966a11de2477d7c9168b11fed3e5b92d0b54"} Apr 17 11:15:41.762677 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:41.761724 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal" podStartSLOduration=3.761707575 podStartE2EDuration="3.761707575s" podCreationTimestamp="2026-04-17 11:15:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:15:40.736032022 +0000 UTC m=+3.602556175" watchObservedRunningTime="2026-04-17 11:15:41.761707575 +0000 UTC m=+4.628231726" Apr 17 11:15:42.770060 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:42.770026 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" event={"ID":"4de89c435edead38b0b3647c59aadbd2","Type":"ContainerStarted","Data":"a50c0dc546243b092ece74274dc85f6889b4e04573571b87a0275554f380fb3c"} Apr 17 11:15:43.278456 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:43.278409 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs\") pod \"network-metrics-daemon-9g7pq\" (UID: \"0ba74b24-e523-481e-82b5-080dc7ecb2e2\") " pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:43.278642 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:43.278474 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdz4\" (UniqueName: \"kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4\") pod \"network-check-target-47nt5\" (UID: \"bfa20876-9d47-42bf-aad5-24503e05b86e\") " pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:43.278642 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:43.278612 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:15:43.278642 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:43.278635 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:15:43.278861 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:43.278651 2579 projected.go:194] Error preparing data for projected volume kube-api-access-lfdz4 for pod openshift-network-diagnostics/network-check-target-47nt5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 
11:15:43.278861 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:43.278712 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4 podName:bfa20876-9d47-42bf-aad5-24503e05b86e nodeName:}" failed. No retries permitted until 2026-04-17 11:15:47.278694811 +0000 UTC m=+10.145218940 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-lfdz4" (UniqueName: "kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4") pod "network-check-target-47nt5" (UID: "bfa20876-9d47-42bf-aad5-24503e05b86e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:15:43.278861 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:43.278611 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:15:43.278861 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:43.278829 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs podName:0ba74b24-e523-481e-82b5-080dc7ecb2e2 nodeName:}" failed. No retries permitted until 2026-04-17 11:15:47.278813648 +0000 UTC m=+10.145337789 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs") pod "network-metrics-daemon-9g7pq" (UID: "0ba74b24-e523-481e-82b5-080dc7ecb2e2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:15:43.699630 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:43.699335 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:43.699630 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:43.699335 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:43.699630 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:43.699466 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:15:43.699630 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:43.699576 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:15:45.699145 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:45.698914 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:45.699145 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:45.699033 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:15:45.699808 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:45.699743 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:45.699924 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:45.699868 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:15:47.316388 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:47.315922 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdz4\" (UniqueName: \"kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4\") pod \"network-check-target-47nt5\" (UID: \"bfa20876-9d47-42bf-aad5-24503e05b86e\") " pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:47.316388 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:47.316022 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs\") pod \"network-metrics-daemon-9g7pq\" (UID: \"0ba74b24-e523-481e-82b5-080dc7ecb2e2\") " pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:47.316388 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:47.316110 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:15:47.316388 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:47.316171 2579 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs podName:0ba74b24-e523-481e-82b5-080dc7ecb2e2 nodeName:}" failed. No retries permitted until 2026-04-17 11:15:55.316153237 +0000 UTC m=+18.182677367 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs") pod "network-metrics-daemon-9g7pq" (UID: "0ba74b24-e523-481e-82b5-080dc7ecb2e2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:15:47.316388 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:47.316111 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:15:47.316388 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:47.316235 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:15:47.316388 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:47.316247 2579 projected.go:194] Error preparing data for projected volume kube-api-access-lfdz4 for pod openshift-network-diagnostics/network-check-target-47nt5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:15:47.316388 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:47.316289 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4 podName:bfa20876-9d47-42bf-aad5-24503e05b86e nodeName:}" failed. No retries permitted until 2026-04-17 11:15:55.31627735 +0000 UTC m=+18.182801491 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-lfdz4" (UniqueName: "kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4") pod "network-check-target-47nt5" (UID: "bfa20876-9d47-42bf-aad5-24503e05b86e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:15:47.700037 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:47.700005 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:47.700708 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:47.700086 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:47.700708 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:47.700365 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:15:47.700708 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:47.700444 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:15:49.698884 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:49.698797 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:49.699229 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:49.698915 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:15:49.699229 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:49.698976 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:49.699229 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:49.699079 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:15:51.699053 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:51.699010 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:51.699053 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:51.699050 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:51.699492 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:51.699129 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:15:51.699492 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:51.699264 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:15:53.698835 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:53.698797 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:53.699281 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:53.698849 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:53.699281 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:53.698949 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:15:53.699281 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:53.699068 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:15:55.374337 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:55.374294 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs\") pod \"network-metrics-daemon-9g7pq\" (UID: \"0ba74b24-e523-481e-82b5-080dc7ecb2e2\") " pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:55.374777 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:55.374352 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdz4\" (UniqueName: \"kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4\") pod \"network-check-target-47nt5\" (UID: \"bfa20876-9d47-42bf-aad5-24503e05b86e\") " pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:55.374777 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:55.374457 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:15:55.374777 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:55.374480 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:15:55.374777 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:55.374493 2579 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:15:55.374777 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:55.374505 2579 projected.go:194] Error preparing data for projected volume kube-api-access-lfdz4 for pod openshift-network-diagnostics/network-check-target-47nt5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:15:55.374777 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:55.374524 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs podName:0ba74b24-e523-481e-82b5-080dc7ecb2e2 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:11.374507859 +0000 UTC m=+34.241031992 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs") pod "network-metrics-daemon-9g7pq" (UID: "0ba74b24-e523-481e-82b5-080dc7ecb2e2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:15:55.374777 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:55.374555 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4 podName:bfa20876-9d47-42bf-aad5-24503e05b86e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:11.374537129 +0000 UTC m=+34.241061261 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-lfdz4" (UniqueName: "kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4") pod "network-check-target-47nt5" (UID: "bfa20876-9d47-42bf-aad5-24503e05b86e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:15:55.699570 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:55.699532 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:55.699743 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:55.699652 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:15:55.699743 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:55.699714 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:55.699890 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:55.699851 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:15:57.704361 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.704029 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:57.704361 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:57.704300 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:15:57.704361 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.704350 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:57.705721 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:57.704427 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:15:57.793837 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.793801 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jrxcq" event={"ID":"7920eeb8-72c9-4fe5-aff5-30f78ed7f840","Type":"ContainerStarted","Data":"c36efadc3c48f5078a435bab4eec340d8c7da8586b7f7508664fe530b2322703"} Apr 17 11:15:57.795062 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.795038 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2678k" event={"ID":"0a2631b2-add8-43b1-a9b4-b872018c7373","Type":"ContainerStarted","Data":"758636fbc8416ab8effab0d9a12d4b030bb3014b69163abac7462d392fc05f4b"} Apr 17 11:15:57.796253 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.796233 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dvnsm" event={"ID":"841bd702-cde2-4bf5-9789-aa664c501f8f","Type":"ContainerStarted","Data":"ad09e439855117b6500d07a35636e7e172198eb71ba7d5847e84b2c1052dc54f"} Apr 17 11:15:57.797420 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.797400 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" event={"ID":"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5","Type":"ContainerStarted","Data":"4e6fbbd1ec618835432a852854e4f1bc81c41a6e10985cb1dc6ef060b300abe1"} Apr 17 11:15:57.798616 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.798595 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" event={"ID":"bed77416-eb94-411f-885d-cc01490e88a0","Type":"ContainerStarted","Data":"3ba944c00d3e4ff2479c7cbf134e806f750c55a135d3156d147c802e1d479757"} Apr 17 11:15:57.799747 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.799730 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" 
event={"ID":"feeaed08-a6f0-498e-827c-56a07f3c55d7","Type":"ContainerStarted","Data":"bc63f8f5812d3a5dd6b2fab5c18b0731073127394fd3ac762cb2d10d3fde7b05"} Apr 17 11:15:57.800891 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.800870 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-86dh5" event={"ID":"389cf577-bd02-4903-96d9-cdc3fd99d418","Type":"ContainerStarted","Data":"0b5d6a48be425d1ae26332d5c9ab85b7aeba93967f34cb069427cc662741c750"} Apr 17 11:15:57.802225 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.802203 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-46jrw" event={"ID":"5652250a-f654-49f4-a3fa-82e77fa0b777","Type":"ContainerStarted","Data":"5da934c789103c656235d32f3a6cc5d0f8db8f76a649cc7270e528c63738831a"} Apr 17 11:15:57.804986 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.804949 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jrxcq" podStartSLOduration=3.7619371 podStartE2EDuration="20.804938844s" podCreationTimestamp="2026-04-17 11:15:37 +0000 UTC" firstStartedPulling="2026-04-17 11:15:40.395240459 +0000 UTC m=+3.261764591" lastFinishedPulling="2026-04-17 11:15:57.438242195 +0000 UTC m=+20.304766335" observedRunningTime="2026-04-17 11:15:57.804920023 +0000 UTC m=+20.671444173" watchObservedRunningTime="2026-04-17 11:15:57.804938844 +0000 UTC m=+20.671462993" Apr 17 11:15:57.805103 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.805073 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" podStartSLOduration=19.805066161 podStartE2EDuration="19.805066161s" podCreationTimestamp="2026-04-17 11:15:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:15:42.798063416 +0000 UTC m=+5.664587568" 
watchObservedRunningTime="2026-04-17 11:15:57.805066161 +0000 UTC m=+20.671590311" Apr 17 11:15:57.816404 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.816370 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-46jrw" podStartSLOduration=2.76681643 podStartE2EDuration="19.816358779s" podCreationTimestamp="2026-04-17 11:15:38 +0000 UTC" firstStartedPulling="2026-04-17 11:15:40.362080131 +0000 UTC m=+3.228604260" lastFinishedPulling="2026-04-17 11:15:57.411622462 +0000 UTC m=+20.278146609" observedRunningTime="2026-04-17 11:15:57.816051638 +0000 UTC m=+20.682575788" watchObservedRunningTime="2026-04-17 11:15:57.816358779 +0000 UTC m=+20.682882950" Apr 17 11:15:57.828617 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.828575 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-86dh5" podStartSLOduration=3.7442460669999997 podStartE2EDuration="20.828563113s" podCreationTimestamp="2026-04-17 11:15:37 +0000 UTC" firstStartedPulling="2026-04-17 11:15:40.363379067 +0000 UTC m=+3.229903203" lastFinishedPulling="2026-04-17 11:15:57.447696117 +0000 UTC m=+20.314220249" observedRunningTime="2026-04-17 11:15:57.828123841 +0000 UTC m=+20.694647990" watchObservedRunningTime="2026-04-17 11:15:57.828563113 +0000 UTC m=+20.695087262" Apr 17 11:15:57.853156 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.853115 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2678k" podStartSLOduration=7.449495506 podStartE2EDuration="19.853102461s" podCreationTimestamp="2026-04-17 11:15:38 +0000 UTC" firstStartedPulling="2026-04-17 11:15:40.395140619 +0000 UTC m=+3.261664751" lastFinishedPulling="2026-04-17 11:15:52.798747564 +0000 UTC m=+15.665271706" observedRunningTime="2026-04-17 11:15:57.852647417 +0000 UTC m=+20.719171567" watchObservedRunningTime="2026-04-17 11:15:57.853102461 +0000 UTC m=+20.719626611" Apr 17 
11:15:57.865015 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:57.864967 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-zc9p4" podStartSLOduration=2.789463874 podStartE2EDuration="19.864950102s" podCreationTimestamp="2026-04-17 11:15:38 +0000 UTC" firstStartedPulling="2026-04-17 11:15:40.365155918 +0000 UTC m=+3.231680046" lastFinishedPulling="2026-04-17 11:15:57.440642139 +0000 UTC m=+20.307166274" observedRunningTime="2026-04-17 11:15:57.864538348 +0000 UTC m=+20.731062508" watchObservedRunningTime="2026-04-17 11:15:57.864950102 +0000 UTC m=+20.731474253" Apr 17 11:15:58.805281 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:58.805248 2579 generic.go:358] "Generic (PLEG): container finished" podID="841bd702-cde2-4bf5-9789-aa664c501f8f" containerID="ad09e439855117b6500d07a35636e7e172198eb71ba7d5847e84b2c1052dc54f" exitCode=0 Apr 17 11:15:58.806031 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:58.805324 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dvnsm" event={"ID":"841bd702-cde2-4bf5-9789-aa664c501f8f","Type":"ContainerDied","Data":"ad09e439855117b6500d07a35636e7e172198eb71ba7d5847e84b2c1052dc54f"} Apr 17 11:15:58.807789 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:58.807757 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxcn4_431e03f9-9af4-4fa7-8f47-c50f52e2a7e5/ovn-acl-logging/0.log" Apr 17 11:15:58.808189 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:58.808166 2579 generic.go:358] "Generic (PLEG): container finished" podID="431e03f9-9af4-4fa7-8f47-c50f52e2a7e5" containerID="f7a8d98a841d6b7e942935391f1744345bbbd98f120d4b6ad0b686b8a9f1847c" exitCode=1 Apr 17 11:15:58.808348 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:58.808288 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" 
event={"ID":"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5","Type":"ContainerStarted","Data":"b33b4ba2c0c4853a5377d5a572f49d1ae25cae351035fa2626752f4678594b16"} Apr 17 11:15:58.808348 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:58.808318 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" event={"ID":"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5","Type":"ContainerStarted","Data":"929f03970447eb035e9b0acdef40545d119f6867440697933ebe0d5762202487"} Apr 17 11:15:58.808348 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:58.808331 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" event={"ID":"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5","Type":"ContainerStarted","Data":"4c064e37985315442706cc4291e28091ca4f0d45c1149cc297ae6d0221d41e07"} Apr 17 11:15:58.808503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:58.808352 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" event={"ID":"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5","Type":"ContainerStarted","Data":"6376c46021428325157288b5f679dfd2fc03afa186bd93ff6736a882c40d2c06"} Apr 17 11:15:58.808503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:58.808368 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" event={"ID":"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5","Type":"ContainerDied","Data":"f7a8d98a841d6b7e942935391f1744345bbbd98f120d4b6ad0b686b8a9f1847c"} Apr 17 11:15:59.144483 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:59.144436 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 11:15:59.699591 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:59.699395 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:15:59.699933 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:59.699390 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:15:59.699933 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:59.699696 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:15:59.699933 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:15:59.699760 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:15:59.703559 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:59.703453 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T11:15:59.144463255Z","UUID":"64afa4c1-9560-48cf-aa23-e7a2676835cc","Handler":null,"Name":"","Endpoint":""} Apr 17 11:15:59.705612 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:59.705592 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 11:15:59.705736 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:59.705642 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 11:15:59.811799 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:59.811731 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t9jv4" event={"ID":"ec49f65a-cac1-4bb8-8dd5-f77b34ef2282","Type":"ContainerStarted","Data":"28e55251738974dc5a62cf4e1fc294c2d512c7e26ef0a003b2e7d02d5eb75056"} Apr 17 11:15:59.813688 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:59.813655 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" event={"ID":"feeaed08-a6f0-498e-827c-56a07f3c55d7","Type":"ContainerStarted","Data":"60c5514c27c364ce6ab0abac3e902e30e8043c103074ad282cb1015619fe4aa4"} Apr 17 11:15:59.834044 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:15:59.833992 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-t9jv4" podStartSLOduration=5.783016133 podStartE2EDuration="22.833976186s" podCreationTimestamp="2026-04-17 11:15:37 +0000 
UTC" firstStartedPulling="2026-04-17 11:15:40.387280908 +0000 UTC m=+3.253805038" lastFinishedPulling="2026-04-17 11:15:57.438240959 +0000 UTC m=+20.304765091" observedRunningTime="2026-04-17 11:15:59.833553035 +0000 UTC m=+22.700077186" watchObservedRunningTime="2026-04-17 11:15:59.833976186 +0000 UTC m=+22.700500336" Apr 17 11:16:00.102362 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:00.102280 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-46jrw" Apr 17 11:16:00.411743 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:00.411714 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-46jrw" Apr 17 11:16:00.417555 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:00.417529 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-46jrw" Apr 17 11:16:00.818703 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:00.818668 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxcn4_431e03f9-9af4-4fa7-8f47-c50f52e2a7e5/ovn-acl-logging/0.log" Apr 17 11:16:00.819150 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:00.819076 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" event={"ID":"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5","Type":"ContainerStarted","Data":"d6e5064e962f84ab1d465f8421625278b8c96186ca33ee74cbd362c80d210881"} Apr 17 11:16:00.821117 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:00.821080 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" event={"ID":"feeaed08-a6f0-498e-827c-56a07f3c55d7","Type":"ContainerStarted","Data":"6b8f67be3721ffe7cbc3cec511c9cb426354f4699473cedc7df1863a63d05183"} Apr 17 11:16:00.821887 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:00.821867 2579 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kube-system/konnectivity-agent-46jrw" Apr 17 11:16:00.853832 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:00.853763 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrrv5" podStartSLOduration=2.8727430160000003 podStartE2EDuration="22.853745653s" podCreationTimestamp="2026-04-17 11:15:38 +0000 UTC" firstStartedPulling="2026-04-17 11:15:40.395238259 +0000 UTC m=+3.261762399" lastFinishedPulling="2026-04-17 11:16:00.376240893 +0000 UTC m=+23.242765036" observedRunningTime="2026-04-17 11:16:00.838091347 +0000 UTC m=+23.704615488" watchObservedRunningTime="2026-04-17 11:16:00.853745653 +0000 UTC m=+23.720269802" Apr 17 11:16:01.699351 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:01.699314 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:16:01.699551 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:01.699428 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:16:01.699551 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:01.699495 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:16:01.699674 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:01.699620 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:16:02.830073 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:02.828694 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxcn4_431e03f9-9af4-4fa7-8f47-c50f52e2a7e5/ovn-acl-logging/0.log" Apr 17 11:16:02.830073 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:02.829486 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" event={"ID":"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5","Type":"ContainerStarted","Data":"a151c0742b9c26ebe243a0d2dbc0cf79e73b5784bc626518283f1d445a0f8b6a"} Apr 17 11:16:02.830073 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:02.829890 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:16:02.830073 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:02.829994 2579 scope.go:117] "RemoveContainer" containerID="f7a8d98a841d6b7e942935391f1744345bbbd98f120d4b6ad0b686b8a9f1847c" Apr 17 11:16:02.830073 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:02.830007 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:16:02.831387 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:02.830085 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:16:02.852191 
ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:02.852040 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:16:02.852933 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:02.852853 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:16:03.699367 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:03.699321 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:16:03.699367 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:03.699356 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:16:03.699547 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:03.699478 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:16:03.699601 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:03.699551 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:16:03.832594 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:03.832560 2579 generic.go:358] "Generic (PLEG): container finished" podID="841bd702-cde2-4bf5-9789-aa664c501f8f" containerID="7a1582c23def6cf3bc93499eb295836ea4a3058cca6a5e8fa17bbeaa26480dde" exitCode=0 Apr 17 11:16:03.833080 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:03.832640 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dvnsm" event={"ID":"841bd702-cde2-4bf5-9789-aa664c501f8f","Type":"ContainerDied","Data":"7a1582c23def6cf3bc93499eb295836ea4a3058cca6a5e8fa17bbeaa26480dde"} Apr 17 11:16:03.835725 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:03.835707 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxcn4_431e03f9-9af4-4fa7-8f47-c50f52e2a7e5/ovn-acl-logging/0.log" Apr 17 11:16:03.836022 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:03.836002 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" event={"ID":"431e03f9-9af4-4fa7-8f47-c50f52e2a7e5","Type":"ContainerStarted","Data":"f286ba429c74284bfa9148bec10f910aed769b9adb31b74c0bf3373e58d6e644"} Apr 17 11:16:03.877825 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:03.877762 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" podStartSLOduration=8.782150972 podStartE2EDuration="25.87774947s" podCreationTimestamp="2026-04-17 11:15:38 +0000 UTC" firstStartedPulling="2026-04-17 11:15:40.390679631 +0000 UTC m=+3.257203764" lastFinishedPulling="2026-04-17 11:15:57.48627813 +0000 UTC m=+20.352802262" observedRunningTime="2026-04-17 11:16:03.877179273 +0000 UTC m=+26.743703424" watchObservedRunningTime="2026-04-17 11:16:03.87774947 +0000 UTC m=+26.744273616" Apr 17 11:16:04.719052 ip-10-0-133-190 
kubenswrapper[2579]: I0417 11:16:04.718974 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-47nt5"] Apr 17 11:16:04.719179 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:04.719106 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:16:04.719249 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:04.719225 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:16:04.721980 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:04.721956 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9g7pq"] Apr 17 11:16:04.722109 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:04.722036 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:16:04.722147 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:04.722127 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:16:04.840325 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:04.840289 2579 generic.go:358] "Generic (PLEG): container finished" podID="841bd702-cde2-4bf5-9789-aa664c501f8f" containerID="54fb37997acfb681353077ce90f05a805d3ed754b3a4a72fa183d1e048393230" exitCode=0 Apr 17 11:16:04.840690 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:04.840354 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dvnsm" event={"ID":"841bd702-cde2-4bf5-9789-aa664c501f8f","Type":"ContainerDied","Data":"54fb37997acfb681353077ce90f05a805d3ed754b3a4a72fa183d1e048393230"} Apr 17 11:16:05.844324 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:05.844234 2579 generic.go:358] "Generic (PLEG): container finished" podID="841bd702-cde2-4bf5-9789-aa664c501f8f" containerID="ee440a3cec493c1ce4341ef5bcc04b28640da3c00c03f868fb7ed649154d01ab" exitCode=0 Apr 17 11:16:05.844324 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:05.844308 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dvnsm" event={"ID":"841bd702-cde2-4bf5-9789-aa664c501f8f","Type":"ContainerDied","Data":"ee440a3cec493c1ce4341ef5bcc04b28640da3c00c03f868fb7ed649154d01ab"} Apr 17 11:16:06.699758 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:06.699582 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:16:06.699929 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:06.699581 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:16:06.699929 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:06.699877 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:16:06.700028 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:06.699999 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:16:08.699893 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:08.699475 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:16:08.699893 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:08.699531 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:16:08.699893 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:08.699610 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-47nt5" podUID="bfa20876-9d47-42bf-aad5-24503e05b86e" Apr 17 11:16:08.699893 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:08.699746 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9g7pq" podUID="0ba74b24-e523-481e-82b5-080dc7ecb2e2" Apr 17 11:16:10.448314 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.448284 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeReady" Apr 17 11:16:10.448947 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.448408 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 11:16:10.482993 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.482906 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd"] Apr 17 11:16:10.487905 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.487564 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-fdqlp"] Apr 17 11:16:10.488579 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.487997 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" Apr 17 11:16:10.490727 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.490643 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5b9d896675-7lskv"] Apr 17 11:16:10.490859 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.490843 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fdqlp" Apr 17 11:16:10.491123 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.491093 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 11:16:10.491232 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.491137 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:10.491321 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.491298 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bqhrd\"" Apr 17 11:16:10.491435 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.491416 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:10.493393 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.493355 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 11:16:10.493641 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.493610 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs"] Apr 17 11:16:10.497514 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.494660 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 11:16:10.497514 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.494881 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.500664 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.500412 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 11:16:10.500664 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.500519 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 11:16:10.501647 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.501217 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 11:16:10.502365 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.502346 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-8qcvw\"" Apr 17 11:16:10.505397 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.504815 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7fq9z\"" Apr 17 11:16:10.509372 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.508495 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dl54m"] Apr 17 11:16:10.509372 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.508842 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 11:16:10.513134 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.513109 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q"] Apr 17 11:16:10.513345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.513327 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" Apr 17 11:16:10.515226 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.515209 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 11:16:10.515474 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.515450 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 11:16:10.515625 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.515491 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-p2v9l\"" Apr 17 11:16:10.515701 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.515541 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:10.515916 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.515896 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:10.516183 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.516162 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm"] Apr 17 11:16:10.516325 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.516301 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:10.516503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.516481 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dl54m" Apr 17 11:16:10.519743 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.519214 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:10.519883 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.519761 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 11:16:10.520284 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.520242 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:10.520434 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.520417 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-5vlf2\"" Apr 17 11:16:10.520607 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.520341 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 11:16:10.520746 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.520276 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 11:16:10.521228 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.520742 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-799db9ddcb-l92ps"] Apr 17 11:16:10.521228 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.520890 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" Apr 17 11:16:10.521878 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.521815 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-pxzns\"" Apr 17 11:16:10.521989 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.521814 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 11:16:10.524541 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.524517 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:10.525706 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.525684 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 11:16:10.526192 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.526173 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-99lf7\"" Apr 17 11:16:10.528201 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.526819 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 11:16:10.528201 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.526972 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj"] Apr 17 11:16:10.528201 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.527228 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:10.528680 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.528662 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.532125 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.532103 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7657d8478-pp7qf"] Apr 17 11:16:10.534144 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.533415 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" Apr 17 11:16:10.535459 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.535440 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 11:16:10.535639 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.535624 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 11:16:10.536457 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.536276 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-pmnxw\"" Apr 17 11:16:10.537136 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.537114 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-clmj2"] Apr 17 11:16:10.537290 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.537271 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:10.538874 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.538852 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 11:16:10.539010 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.538908 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 11:16:10.539189 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.539174 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 11:16:10.539617 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.539255 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 11:16:10.539888 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.539868 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 11:16:10.540531 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.540407 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-9wmp5"] Apr 17 11:16:10.540531 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.540522 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.540858 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.540837 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 11:16:10.541032 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.541010 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-swgpj\"" Apr 17 11:16:10.544367 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.544345 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-jbw2t\"" Apr 17 11:16:10.544497 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.544484 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 11:16:10.544983 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.544967 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 11:16:10.545361 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.545343 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 11:16:10.545861 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.545844 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 11:16:10.546438 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.546417 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:10.546924 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.546304 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd"] Apr 17 11:16:10.547016 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.546940 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-fdqlp"] Apr 17 11:16:10.547016 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.546963 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dl54m"] Apr 17 11:16:10.547016 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.546974 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs"] Apr 17 11:16:10.547016 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.546985 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q"] Apr 17 11:16:10.547016 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.547004 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b9d896675-7lskv"] Apr 17 11:16:10.547016 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.547016 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj"] Apr 17 11:16:10.547402 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.547028 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-clmj2"] Apr 17 11:16:10.547402 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.547039 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-9wmp5"] Apr 17 
11:16:10.547402 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.547049 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm"] Apr 17 11:16:10.547402 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.547061 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-799db9ddcb-l92ps"] Apr 17 11:16:10.547402 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.547073 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sswv7"] Apr 17 11:16:10.549058 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.549022 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:10.549187 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.549101 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 11:16:10.549288 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.549272 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 11:16:10.549695 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.549680 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:10.549849 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.549800 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-d26vj\"" Apr 17 11:16:10.550740 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.550415 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tkcmp"] Apr 17 11:16:10.553273 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.553252 2579 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 11:16:10.555917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.555754 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 11:16:10.556846 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.556793 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sswv7"] Apr 17 11:16:10.556846 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.556820 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tkcmp"] Apr 17 11:16:10.556846 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.556829 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7657d8478-pp7qf"] Apr 17 11:16:10.557051 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.556868 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sswv7" Apr 17 11:16:10.557358 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.557326 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:10.558718 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.558571 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lbntp\"" Apr 17 11:16:10.558718 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.558595 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 11:16:10.558718 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.558632 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 11:16:10.559117 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.559096 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 11:16:10.559264 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.559220 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 11:16:10.559500 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.559479 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 11:16:10.561061 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.561044 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f8bg7\"" Apr 17 11:16:10.582363 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.582333 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h88t6\" (UniqueName: \"kubernetes.io/projected/92e7a74f-35b8-4f70-833b-1261a2bed50d-kube-api-access-h88t6\") pod \"volume-data-source-validator-7c6cbb6c87-dl54m\" (UID: \"92e7a74f-35b8-4f70-833b-1261a2bed50d\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dl54m" Apr 17 11:16:10.582495 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.582375 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97fe64f0-9f87-4b25-876e-a59829b69c04-ca-trust-extracted\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.582495 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.582400 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-certificates\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.582495 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.582484 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44d4ecc4-97ed-4995-a7f7-5f731f3fe770-serving-cert\") pod \"service-ca-operator-d6fc45fc5-ds9gs\" (UID: \"44d4ecc4-97ed-4995-a7f7-5f731f3fe770\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" Apr 17 11:16:10.582712 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.582511 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b86b6\" (UniqueName: \"kubernetes.io/projected/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-kube-api-access-b86b6\") pod \"cluster-samples-operator-6dc5bdb6b4-7lqsd\" (UID: \"a3e70c72-ac65-4a09-b59a-570bf07a6dbb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" Apr 17 11:16:10.582712 ip-10-0-133-190 
kubenswrapper[2579]: I0417 11:16:10.582539 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d4ecc4-97ed-4995-a7f7-5f731f3fe770-config\") pod \"service-ca-operator-d6fc45fc5-ds9gs\" (UID: \"44d4ecc4-97ed-4995-a7f7-5f731f3fe770\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" Apr 17 11:16:10.582712 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.582565 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.582712 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.582634 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-65j5q\" (UID: \"cfa96c8c-1c5c-4749-a74a-b6eab2274afd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:10.582712 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.582658 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/97fe64f0-9f87-4b25-876e-a59829b69c04-image-registry-private-configuration\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.582984 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.582706 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-bound-sa-token\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.582984 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.582812 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97fe64f0-9f87-4b25-876e-a59829b69c04-trusted-ca\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.582984 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.582849 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97fe64f0-9f87-4b25-876e-a59829b69c04-installation-pull-secrets\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.582984 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.582883 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp6rx\" (UniqueName: \"kubernetes.io/projected/44d4ecc4-97ed-4995-a7f7-5f731f3fe770-kube-api-access-bp6rx\") pod \"service-ca-operator-d6fc45fc5-ds9gs\" (UID: \"44d4ecc4-97ed-4995-a7f7-5f731f3fe770\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" Apr 17 11:16:10.582984 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.582916 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2dxm\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-kube-api-access-r2dxm\") pod \"image-registry-5b9d896675-7lskv\" 
(UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.582984 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.582952 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-65j5q\" (UID: \"cfa96c8c-1c5c-4749-a74a-b6eab2274afd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:10.582984 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.582981 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7lqsd\" (UID: \"a3e70c72-ac65-4a09-b59a-570bf07a6dbb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" Apr 17 11:16:10.583301 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.583015 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7txqs\" (UniqueName: \"kubernetes.io/projected/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-kube-api-access-7txqs\") pod \"cluster-monitoring-operator-75587bd455-65j5q\" (UID: \"cfa96c8c-1c5c-4749-a74a-b6eab2274afd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:10.583301 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.583039 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7fwr\" (UniqueName: \"kubernetes.io/projected/8a1529e3-e492-461e-9d34-440e5555b197-kube-api-access-p7fwr\") pod \"network-check-source-8894fc9bd-fdqlp\" (UID: \"8a1529e3-e492-461e-9d34-440e5555b197\") " 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fdqlp" Apr 17 11:16:10.684121 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684082 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-65j5q\" (UID: \"cfa96c8c-1c5c-4749-a74a-b6eab2274afd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:10.684311 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684131 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-stats-auth\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:10.684311 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684151 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/beb961a2-0877-4cd8-be68-062f895cef5d-trusted-ca\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.684311 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684183 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/beb961a2-0877-4cd8-be68-062f895cef5d-image-registry-private-configuration\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.684311 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.684258 2579 
secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:16:10.684518 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684341 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h88t6\" (UniqueName: \"kubernetes.io/projected/92e7a74f-35b8-4f70-833b-1261a2bed50d-kube-api-access-h88t6\") pod \"volume-data-source-validator-7c6cbb6c87-dl54m\" (UID: \"92e7a74f-35b8-4f70-833b-1261a2bed50d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dl54m" Apr 17 11:16:10.684518 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.684358 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls podName:cfa96c8c-1c5c-4749-a74a-b6eab2274afd nodeName:}" failed. No retries permitted until 2026-04-17 11:16:11.184336359 +0000 UTC m=+34.050860490 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-65j5q" (UID: "cfa96c8c-1c5c-4749-a74a-b6eab2274afd") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:16:10.684518 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684394 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b86b6\" (UniqueName: \"kubernetes.io/projected/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-kube-api-access-b86b6\") pod \"cluster-samples-operator-6dc5bdb6b4-7lqsd\" (UID: \"a3e70c72-ac65-4a09-b59a-570bf07a6dbb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" Apr 17 11:16:10.684518 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684424 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:10.684518 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684451 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b9086c-55f6-4d0b-a998-d22a793d7d17-config\") pod \"console-operator-9d4b6777b-9wmp5\" (UID: \"89b9086c-55f6-4d0b-a998-d22a793d7d17\") " pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:10.684794 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684577 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d4ecc4-97ed-4995-a7f7-5f731f3fe770-config\") pod \"service-ca-operator-d6fc45fc5-ds9gs\" (UID: 
\"44d4ecc4-97ed-4995-a7f7-5f731f3fe770\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" Apr 17 11:16:10.684794 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684610 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.684794 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684642 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxs58\" (UniqueName: \"kubernetes.io/projected/b2c84c0f-0d60-4465-b1fd-4f39963e95d4-kube-api-access-wxs58\") pod \"kube-storage-version-migrator-operator-6769c5d45-6hppm\" (UID: \"b2c84c0f-0d60-4465-b1fd-4f39963e95d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" Apr 17 11:16:10.684794 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684673 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xjjnj\" (UID: \"d90dfee2-676e-4224-8ec8-8d764b523802\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" Apr 17 11:16:10.684794 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684707 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bp6rx\" (UniqueName: \"kubernetes.io/projected/44d4ecc4-97ed-4995-a7f7-5f731f3fe770-kube-api-access-bp6rx\") pod \"service-ca-operator-d6fc45fc5-ds9gs\" (UID: \"44d4ecc4-97ed-4995-a7f7-5f731f3fe770\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" Apr 17 11:16:10.684794 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684734 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-bound-sa-token\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.684794 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684757 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4dd7876b-a6b7-4cf0-b645-979aead5bdff-tmp-dir\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:10.685121 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.684734 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:10.685121 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.684815 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b9d896675-7lskv: secret "image-registry-tls" not found Apr 17 11:16:10.685121 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.684895 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls podName:97fe64f0-9f87-4b25-876e-a59829b69c04 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:11.184868153 +0000 UTC m=+34.051392311 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls") pod "image-registry-5b9d896675-7lskv" (UID: "97fe64f0-9f87-4b25-876e-a59829b69c04") : secret "image-registry-tls" not found Apr 17 11:16:10.685121 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684799 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/97fe64f0-9f87-4b25-876e-a59829b69c04-image-registry-private-configuration\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.685121 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684935 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9wpx\" (UniqueName: \"kubernetes.io/projected/e2778730-467e-4432-9a1a-d5f871276f6d-kube-api-access-z9wpx\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.685121 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684966 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-default-certificate\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:10.685121 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.684996 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97fe64f0-9f87-4b25-876e-a59829b69c04-trusted-ca\") pod \"image-registry-5b9d896675-7lskv\" (UID: 
\"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.685121 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685028 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/beb961a2-0877-4cd8-be68-062f895cef5d-installation-pull-secrets\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.685121 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685058 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dd7876b-a6b7-4cf0-b645-979aead5bdff-config-volume\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:10.685121 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685092 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2dxm\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-kube-api-access-r2dxm\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.685121 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685119 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2c84c0f-0d60-4465-b1fd-4f39963e95d4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6hppm\" (UID: \"b2c84c0f-0d60-4465-b1fd-4f39963e95d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" Apr 17 11:16:10.685568 ip-10-0-133-190 kubenswrapper[2579]: 
I0417 11:16:10.685152 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7lqsd\" (UID: \"a3e70c72-ac65-4a09-b59a-570bf07a6dbb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" Apr 17 11:16:10.685568 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685178 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:10.685568 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685204 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e2778730-467e-4432-9a1a-d5f871276f6d-snapshots\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.685568 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685230 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44d4ecc4-97ed-4995-a7f7-5f731f3fe770-config\") pod \"service-ca-operator-d6fc45fc5-ds9gs\" (UID: \"44d4ecc4-97ed-4995-a7f7-5f731f3fe770\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" Apr 17 11:16:10.685568 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685235 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7txqs\" (UniqueName: \"kubernetes.io/projected/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-kube-api-access-7txqs\") pod 
\"cluster-monitoring-operator-75587bd455-65j5q\" (UID: \"cfa96c8c-1c5c-4749-a74a-b6eab2274afd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:10.685568 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685282 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7fwr\" (UniqueName: \"kubernetes.io/projected/8a1529e3-e492-461e-9d34-440e5555b197-kube-api-access-p7fwr\") pod \"network-check-source-8894fc9bd-fdqlp\" (UID: \"8a1529e3-e492-461e-9d34-440e5555b197\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fdqlp" Apr 17 11:16:10.685568 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685311 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:10.685568 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685338 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcwtp\" (UniqueName: \"kubernetes.io/projected/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-kube-api-access-fcwtp\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:10.685568 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685364 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mwlm\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-kube-api-access-4mwlm\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 
11:16:10.685568 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.685385 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 11:16:10.685568 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685392 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b9086c-55f6-4d0b-a998-d22a793d7d17-serving-cert\") pod \"console-operator-9d4b6777b-9wmp5\" (UID: \"89b9086c-55f6-4d0b-a998-d22a793d7d17\") " pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:10.685568 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.685425 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls podName:a3e70c72-ac65-4a09-b59a-570bf07a6dbb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:11.185411113 +0000 UTC m=+34.051935241 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7lqsd" (UID: "a3e70c72-ac65-4a09-b59a-570bf07a6dbb") : secret "samples-operator-tls" not found Apr 17 11:16:10.685568 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685443 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97fe64f0-9f87-4b25-876e-a59829b69c04-ca-trust-extracted\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.685568 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685470 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-certificates\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.685568 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685496 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e2778730-467e-4432-9a1a-d5f871276f6d-tmp\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.686241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685521 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2778730-467e-4432-9a1a-d5f871276f6d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " 
pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.686241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685551 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/beb961a2-0877-4cd8-be68-062f895cef5d-ca-trust-extracted\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.686241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685594 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44d4ecc4-97ed-4995-a7f7-5f731f3fe770-serving-cert\") pod \"service-ca-operator-d6fc45fc5-ds9gs\" (UID: \"44d4ecc4-97ed-4995-a7f7-5f731f3fe770\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" Apr 17 11:16:10.686241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685629 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert\") pod \"ingress-canary-sswv7\" (UID: \"05a441fa-9d9b-40d1-adfd-ffe296dfb2d0\") " pod="openshift-ingress-canary/ingress-canary-sswv7" Apr 17 11:16:10.686241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685749 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89b9086c-55f6-4d0b-a998-d22a793d7d17-trusted-ca\") pod \"console-operator-9d4b6777b-9wmp5\" (UID: \"89b9086c-55f6-4d0b-a998-d22a793d7d17\") " pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:10.686241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685819 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/e2778730-467e-4432-9a1a-d5f871276f6d-serving-cert\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.686241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685864 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8drh\" (UniqueName: \"kubernetes.io/projected/89b9086c-55f6-4d0b-a998-d22a793d7d17-kube-api-access-g8drh\") pod \"console-operator-9d4b6777b-9wmp5\" (UID: \"89b9086c-55f6-4d0b-a998-d22a793d7d17\") " pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:10.686241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685974 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97fe64f0-9f87-4b25-876e-a59829b69c04-trusted-ca\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.686241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.685990 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-65j5q\" (UID: \"cfa96c8c-1c5c-4749-a74a-b6eab2274afd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:10.686241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.686035 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2c84c0f-0d60-4465-b1fd-4f39963e95d4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6hppm\" (UID: \"b2c84c0f-0d60-4465-b1fd-4f39963e95d4\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" Apr 17 11:16:10.686241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.686088 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97fe64f0-9f87-4b25-876e-a59829b69c04-ca-trust-extracted\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.686241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.686143 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-bound-sa-token\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.686241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.686174 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n84q\" (UniqueName: \"kubernetes.io/projected/4dd7876b-a6b7-4cf0-b645-979aead5bdff-kube-api-access-8n84q\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:10.686241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.686198 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-certificates\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.686858 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.686253 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-nw7bt\" (UniqueName: \"kubernetes.io/projected/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-kube-api-access-nw7bt\") pod \"ingress-canary-sswv7\" (UID: \"05a441fa-9d9b-40d1-adfd-ffe296dfb2d0\") " pod="openshift-ingress-canary/ingress-canary-sswv7" Apr 17 11:16:10.686858 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.686302 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97fe64f0-9f87-4b25-876e-a59829b69c04-installation-pull-secrets\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.686858 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.686328 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2778730-467e-4432-9a1a-d5f871276f6d-service-ca-bundle\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.686858 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.686352 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.686858 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.686375 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d90dfee2-676e-4224-8ec8-8d764b523802-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-xjjnj\" (UID: \"d90dfee2-676e-4224-8ec8-8d764b523802\") 
" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" Apr 17 11:16:10.686858 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.686402 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/beb961a2-0877-4cd8-be68-062f895cef5d-registry-certificates\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.687415 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.687389 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-65j5q\" (UID: \"cfa96c8c-1c5c-4749-a74a-b6eab2274afd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:10.690027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.690004 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/97fe64f0-9f87-4b25-876e-a59829b69c04-image-registry-private-configuration\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.690027 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.689999 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97fe64f0-9f87-4b25-876e-a59829b69c04-installation-pull-secrets\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.690327 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.690296 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44d4ecc4-97ed-4995-a7f7-5f731f3fe770-serving-cert\") pod \"service-ca-operator-d6fc45fc5-ds9gs\" (UID: \"44d4ecc4-97ed-4995-a7f7-5f731f3fe770\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" Apr 17 11:16:10.699124 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.698919 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:16:10.699124 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.698952 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:16:10.700065 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.700039 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b86b6\" (UniqueName: \"kubernetes.io/projected/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-kube-api-access-b86b6\") pod \"cluster-samples-operator-6dc5bdb6b4-7lqsd\" (UID: \"a3e70c72-ac65-4a09-b59a-570bf07a6dbb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" Apr 17 11:16:10.701242 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.701191 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2dxm\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-kube-api-access-r2dxm\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.701740 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.701498 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 11:16:10.701740 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.701682 2579 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x2vcd\"" Apr 17 11:16:10.701976 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.701937 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-98wmj\"" Apr 17 11:16:10.702480 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.702441 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7fwr\" (UniqueName: \"kubernetes.io/projected/8a1529e3-e492-461e-9d34-440e5555b197-kube-api-access-p7fwr\") pod \"network-check-source-8894fc9bd-fdqlp\" (UID: \"8a1529e3-e492-461e-9d34-440e5555b197\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fdqlp" Apr 17 11:16:10.702677 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.702653 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h88t6\" (UniqueName: \"kubernetes.io/projected/92e7a74f-35b8-4f70-833b-1261a2bed50d-kube-api-access-h88t6\") pod \"volume-data-source-validator-7c6cbb6c87-dl54m\" (UID: \"92e7a74f-35b8-4f70-833b-1261a2bed50d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dl54m" Apr 17 11:16:10.704263 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.704227 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-bound-sa-token\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:10.704962 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.704924 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp6rx\" (UniqueName: \"kubernetes.io/projected/44d4ecc4-97ed-4995-a7f7-5f731f3fe770-kube-api-access-bp6rx\") pod \"service-ca-operator-d6fc45fc5-ds9gs\" 
(UID: \"44d4ecc4-97ed-4995-a7f7-5f731f3fe770\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" Apr 17 11:16:10.706584 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.706558 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7txqs\" (UniqueName: \"kubernetes.io/projected/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-kube-api-access-7txqs\") pod \"cluster-monitoring-operator-75587bd455-65j5q\" (UID: \"cfa96c8c-1c5c-4749-a74a-b6eab2274afd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:10.787632 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.787544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxs58\" (UniqueName: \"kubernetes.io/projected/b2c84c0f-0d60-4465-b1fd-4f39963e95d4-kube-api-access-wxs58\") pod \"kube-storage-version-migrator-operator-6769c5d45-6hppm\" (UID: \"b2c84c0f-0d60-4465-b1fd-4f39963e95d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" Apr 17 11:16:10.787632 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.787596 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xjjnj\" (UID: \"d90dfee2-676e-4224-8ec8-8d764b523802\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" Apr 17 11:16:10.787632 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.787626 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-bound-sa-token\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" 
Apr 17 11:16:10.787968 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.787656 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4dd7876b-a6b7-4cf0-b645-979aead5bdff-tmp-dir\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:10.787968 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.787688 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9wpx\" (UniqueName: \"kubernetes.io/projected/e2778730-467e-4432-9a1a-d5f871276f6d-kube-api-access-z9wpx\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.787968 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.787715 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-default-certificate\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:10.787968 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.787731 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 11:16:10.787968 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.787748 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/beb961a2-0877-4cd8-be68-062f895cef5d-installation-pull-secrets\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.787968 ip-10-0-133-190 kubenswrapper[2579]: I0417 
11:16:10.787791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dd7876b-a6b7-4cf0-b645-979aead5bdff-config-volume\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:10.787968 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.787828 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert podName:d90dfee2-676e-4224-8ec8-8d764b523802 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:11.287808851 +0000 UTC m=+34.154332980 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xjjnj" (UID: "d90dfee2-676e-4224-8ec8-8d764b523802") : secret "networking-console-plugin-cert" not found Apr 17 11:16:10.787968 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.787856 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2c84c0f-0d60-4465-b1fd-4f39963e95d4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6hppm\" (UID: \"b2c84c0f-0d60-4465-b1fd-4f39963e95d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" Apr 17 11:16:10.787968 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.787902 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:10.787968 ip-10-0-133-190 kubenswrapper[2579]: I0417 
11:16:10.787937 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e2778730-467e-4432-9a1a-d5f871276f6d-snapshots\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.787982 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788019 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcwtp\" (UniqueName: \"kubernetes.io/projected/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-kube-api-access-fcwtp\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788046 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mwlm\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-kube-api-access-4mwlm\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788077 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b9086c-55f6-4d0b-a998-d22a793d7d17-serving-cert\") pod \"console-operator-9d4b6777b-9wmp5\" (UID: 
\"89b9086c-55f6-4d0b-a998-d22a793d7d17\") " pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788083 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4dd7876b-a6b7-4cf0-b645-979aead5bdff-tmp-dir\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788104 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e2778730-467e-4432-9a1a-d5f871276f6d-tmp\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788131 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2778730-467e-4432-9a1a-d5f871276f6d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788158 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/beb961a2-0877-4cd8-be68-062f895cef5d-ca-trust-extracted\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788201 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert\") pod \"ingress-canary-sswv7\" (UID: \"05a441fa-9d9b-40d1-adfd-ffe296dfb2d0\") " pod="openshift-ingress-canary/ingress-canary-sswv7" Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788224 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89b9086c-55f6-4d0b-a998-d22a793d7d17-trusted-ca\") pod \"console-operator-9d4b6777b-9wmp5\" (UID: \"89b9086c-55f6-4d0b-a998-d22a793d7d17\") " pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788246 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2778730-467e-4432-9a1a-d5f871276f6d-serving-cert\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788261 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dd7876b-a6b7-4cf0-b645-979aead5bdff-config-volume\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788274 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8drh\" (UniqueName: \"kubernetes.io/projected/89b9086c-55f6-4d0b-a998-d22a793d7d17-kube-api-access-g8drh\") pod \"console-operator-9d4b6777b-9wmp5\" (UID: \"89b9086c-55f6-4d0b-a998-d22a793d7d17\") " pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788339 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2c84c0f-0d60-4465-b1fd-4f39963e95d4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6hppm\" (UID: \"b2c84c0f-0d60-4465-b1fd-4f39963e95d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.788359 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:10.788398 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788377 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8n84q\" (UniqueName: \"kubernetes.io/projected/4dd7876b-a6b7-4cf0-b645-979aead5bdff-kube-api-access-8n84q\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.788406 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls podName:4dd7876b-a6b7-4cf0-b645-979aead5bdff nodeName:}" failed. No retries permitted until 2026-04-17 11:16:11.288389815 +0000 UTC m=+34.154913948 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls") pod "dns-default-tkcmp" (UID: "4dd7876b-a6b7-4cf0-b645-979aead5bdff") : secret "dns-default-metrics-tls" not found Apr 17 11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788446 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nw7bt\" (UniqueName: \"kubernetes.io/projected/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-kube-api-access-nw7bt\") pod \"ingress-canary-sswv7\" (UID: \"05a441fa-9d9b-40d1-adfd-ffe296dfb2d0\") " pod="openshift-ingress-canary/ingress-canary-sswv7" Apr 17 11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788488 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2778730-467e-4432-9a1a-d5f871276f6d-service-ca-bundle\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788515 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788541 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d90dfee2-676e-4224-8ec8-8d764b523802-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-xjjnj\" (UID: \"d90dfee2-676e-4224-8ec8-8d764b523802\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" Apr 17 
11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788568 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/beb961a2-0877-4cd8-be68-062f895cef5d-registry-certificates\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788607 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-stats-auth\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788631 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/beb961a2-0877-4cd8-be68-062f895cef5d-trusted-ca\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788677 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/beb961a2-0877-4cd8-be68-062f895cef5d-image-registry-private-configuration\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788681 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/beb961a2-0877-4cd8-be68-062f895cef5d-ca-trust-extracted\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788730 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e2778730-467e-4432-9a1a-d5f871276f6d-snapshots\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.788805 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.788852 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs podName:615b2482-c02f-4752-9ea6-7e7cef5c1fe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:11.288838073 +0000 UTC m=+34.155362218 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs") pod "router-default-7657d8478-pp7qf" (UID: "615b2482-c02f-4752-9ea6-7e7cef5c1fe9") : secret "router-metrics-certs-default" not found Apr 17 11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.788866 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:10.789129 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.788880 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-799db9ddcb-l92ps: secret "image-registry-tls" not found Apr 17 11:16:10.789852 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.788883 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b9086c-55f6-4d0b-a998-d22a793d7d17-config\") pod \"console-operator-9d4b6777b-9wmp5\" (UID: \"89b9086c-55f6-4d0b-a998-d22a793d7d17\") " pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:10.789852 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.788920 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls podName:beb961a2-0877-4cd8-be68-062f895cef5d nodeName:}" failed. No retries permitted until 2026-04-17 11:16:11.288906187 +0000 UTC m=+34.155430316 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls") pod "image-registry-799db9ddcb-l92ps" (UID: "beb961a2-0877-4cd8-be68-062f895cef5d") : secret "image-registry-tls" not found Apr 17 11:16:10.789852 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.789071 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle podName:615b2482-c02f-4752-9ea6-7e7cef5c1fe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:11.289052913 +0000 UTC m=+34.155577072 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle") pod "router-default-7657d8478-pp7qf" (UID: "615b2482-c02f-4752-9ea6-7e7cef5c1fe9") : configmap references non-existent config key: service-ca.crt Apr 17 11:16:10.789852 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.789107 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e2778730-467e-4432-9a1a-d5f871276f6d-tmp\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.790700 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.790632 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d90dfee2-676e-4224-8ec8-8d764b523802-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-xjjnj\" (UID: \"d90dfee2-676e-4224-8ec8-8d764b523802\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" Apr 17 11:16:10.790910 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.790888 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/beb961a2-0877-4cd8-be68-062f895cef5d-trusted-ca\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.791109 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.791087 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2c84c0f-0d60-4465-b1fd-4f39963e95d4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6hppm\" (UID: \"b2c84c0f-0d60-4465-b1fd-4f39963e95d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" Apr 17 11:16:10.791433 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.791411 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/beb961a2-0877-4cd8-be68-062f895cef5d-registry-certificates\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.792922 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.791574 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2778730-467e-4432-9a1a-d5f871276f6d-serving-cert\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.792922 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.791667 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b9086c-55f6-4d0b-a998-d22a793d7d17-config\") pod \"console-operator-9d4b6777b-9wmp5\" (UID: \"89b9086c-55f6-4d0b-a998-d22a793d7d17\") " pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:10.792922 
ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.791697 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:10.792922 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:10.791790 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert podName:05a441fa-9d9b-40d1-adfd-ffe296dfb2d0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:11.291756153 +0000 UTC m=+34.158280294 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert") pod "ingress-canary-sswv7" (UID: "05a441fa-9d9b-40d1-adfd-ffe296dfb2d0") : secret "canary-serving-cert" not found Apr 17 11:16:10.792922 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.792000 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/beb961a2-0877-4cd8-be68-062f895cef5d-installation-pull-secrets\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.792922 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.792168 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-default-certificate\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:10.792922 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.792405 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2c84c0f-0d60-4465-b1fd-4f39963e95d4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6hppm\" 
(UID: \"b2c84c0f-0d60-4465-b1fd-4f39963e95d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" Apr 17 11:16:10.792922 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.792526 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2778730-467e-4432-9a1a-d5f871276f6d-service-ca-bundle\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.793596 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.793567 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2778730-467e-4432-9a1a-d5f871276f6d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-clmj2\" (UID: \"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.794042 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.794018 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-stats-auth\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:10.795328 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.795300 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89b9086c-55f6-4d0b-a998-d22a793d7d17-trusted-ca\") pod \"console-operator-9d4b6777b-9wmp5\" (UID: \"89b9086c-55f6-4d0b-a998-d22a793d7d17\") " pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:10.796745 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.796719 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/beb961a2-0877-4cd8-be68-062f895cef5d-image-registry-private-configuration\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.796848 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.796784 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-bound-sa-token\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.797030 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.797011 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b9086c-55f6-4d0b-a998-d22a793d7d17-serving-cert\") pod \"console-operator-9d4b6777b-9wmp5\" (UID: \"89b9086c-55f6-4d0b-a998-d22a793d7d17\") " pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:10.800826 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.800799 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxs58\" (UniqueName: \"kubernetes.io/projected/b2c84c0f-0d60-4465-b1fd-4f39963e95d4-kube-api-access-wxs58\") pod \"kube-storage-version-migrator-operator-6769c5d45-6hppm\" (UID: \"b2c84c0f-0d60-4465-b1fd-4f39963e95d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" Apr 17 11:16:10.801733 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.801691 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9wpx\" (UniqueName: \"kubernetes.io/projected/e2778730-467e-4432-9a1a-d5f871276f6d-kube-api-access-z9wpx\") pod \"insights-operator-585dfdc468-clmj2\" (UID: 
\"e2778730-467e-4432-9a1a-d5f871276f6d\") " pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.802406 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.802310 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcwtp\" (UniqueName: \"kubernetes.io/projected/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-kube-api-access-fcwtp\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:10.802406 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.802355 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw7bt\" (UniqueName: \"kubernetes.io/projected/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-kube-api-access-nw7bt\") pod \"ingress-canary-sswv7\" (UID: \"05a441fa-9d9b-40d1-adfd-ffe296dfb2d0\") " pod="openshift-ingress-canary/ingress-canary-sswv7" Apr 17 11:16:10.802623 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.802603 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mwlm\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-kube-api-access-4mwlm\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:10.802896 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.802876 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n84q\" (UniqueName: \"kubernetes.io/projected/4dd7876b-a6b7-4cf0-b645-979aead5bdff-kube-api-access-8n84q\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:10.803677 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.803660 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8drh\" (UniqueName: 
\"kubernetes.io/projected/89b9086c-55f6-4d0b-a998-d22a793d7d17-kube-api-access-g8drh\") pod \"console-operator-9d4b6777b-9wmp5\" (UID: \"89b9086c-55f6-4d0b-a998-d22a793d7d17\") " pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:10.815127 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.815106 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fdqlp" Apr 17 11:16:10.839064 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.839025 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" Apr 17 11:16:10.856259 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.856231 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dl54m" Apr 17 11:16:10.864883 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.864859 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" Apr 17 11:16:10.896106 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.896076 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-clmj2" Apr 17 11:16:10.904799 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:10.904762 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:11.193689 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:11.193649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-65j5q\" (UID: \"cfa96c8c-1c5c-4749-a74a-b6eab2274afd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:11.193893 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:11.193725 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:11.193893 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:11.193804 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7lqsd\" (UID: \"a3e70c72-ac65-4a09-b59a-570bf07a6dbb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" Apr 17 11:16:11.193893 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.193831 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:16:11.194057 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.193894 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:11.194057 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.193904 2579 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls podName:cfa96c8c-1c5c-4749-a74a-b6eab2274afd nodeName:}" failed. No retries permitted until 2026-04-17 11:16:12.193884509 +0000 UTC m=+35.060408642 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-65j5q" (UID: "cfa96c8c-1c5c-4749-a74a-b6eab2274afd") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:16:11.194057 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.193912 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b9d896675-7lskv: secret "image-registry-tls" not found Apr 17 11:16:11.194057 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.193944 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 11:16:11.194057 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.193961 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls podName:97fe64f0-9f87-4b25-876e-a59829b69c04 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:12.193948755 +0000 UTC m=+35.060472902 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls") pod "image-registry-5b9d896675-7lskv" (UID: "97fe64f0-9f87-4b25-876e-a59829b69c04") : secret "image-registry-tls" not found Apr 17 11:16:11.194057 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.193999 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls podName:a3e70c72-ac65-4a09-b59a-570bf07a6dbb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:12.193986815 +0000 UTC m=+35.060510947 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7lqsd" (UID: "a3e70c72-ac65-4a09-b59a-570bf07a6dbb") : secret "samples-operator-tls" not found Apr 17 11:16:11.294698 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:11.294656 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:11.294890 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:11.294730 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert\") pod \"ingress-canary-sswv7\" (UID: \"05a441fa-9d9b-40d1-adfd-ffe296dfb2d0\") " pod="openshift-ingress-canary/ingress-canary-sswv7" Apr 17 11:16:11.294890 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:11.294806 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:11.294890 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.294841 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:16:11.294890 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:11.294865 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:11.295086 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.294914 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs podName:615b2482-c02f-4752-9ea6-7e7cef5c1fe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:12.294895755 +0000 UTC m=+35.161419883 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs") pod "router-default-7657d8478-pp7qf" (UID: "615b2482-c02f-4752-9ea6-7e7cef5c1fe9") : secret "router-metrics-certs-default" not found Apr 17 11:16:11.295086 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.294916 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:11.295086 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:11.294941 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xjjnj\" (UID: \"d90dfee2-676e-4224-8ec8-8d764b523802\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" Apr 17 11:16:11.295086 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.294997 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:11.295086 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.295015 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-799db9ddcb-l92ps: secret "image-registry-tls" not found Apr 17 11:16:11.295086 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.295025 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 11:16:11.295086 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.294990 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert podName:05a441fa-9d9b-40d1-adfd-ffe296dfb2d0 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:12.294973583 +0000 UTC m=+35.161497711 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert") pod "ingress-canary-sswv7" (UID: "05a441fa-9d9b-40d1-adfd-ffe296dfb2d0") : secret "canary-serving-cert" not found Apr 17 11:16:11.295086 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.295064 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls podName:beb961a2-0877-4cd8-be68-062f895cef5d nodeName:}" failed. No retries permitted until 2026-04-17 11:16:12.295052343 +0000 UTC m=+35.161576474 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls") pod "image-registry-799db9ddcb-l92ps" (UID: "beb961a2-0877-4cd8-be68-062f895cef5d") : secret "image-registry-tls" not found Apr 17 11:16:11.295086 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.295078 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert podName:d90dfee2-676e-4224-8ec8-8d764b523802 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:12.295070699 +0000 UTC m=+35.161594826 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xjjnj" (UID: "d90dfee2-676e-4224-8ec8-8d764b523802") : secret "networking-console-plugin-cert" not found Apr 17 11:16:11.295086 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.295092 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle podName:615b2482-c02f-4752-9ea6-7e7cef5c1fe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:12.295087188 +0000 UTC m=+35.161611315 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle") pod "router-default-7657d8478-pp7qf" (UID: "615b2482-c02f-4752-9ea6-7e7cef5c1fe9") : configmap references non-existent config key: service-ca.crt Apr 17 11:16:11.295443 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:11.295137 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:11.295443 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.295203 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:11.295443 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.295227 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls podName:4dd7876b-a6b7-4cf0-b645-979aead5bdff nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:12.295221402 +0000 UTC m=+35.161745530 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls") pod "dns-default-tkcmp" (UID: "4dd7876b-a6b7-4cf0-b645-979aead5bdff") : secret "dns-default-metrics-tls" not found Apr 17 11:16:11.396429 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:11.396393 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs\") pod \"network-metrics-daemon-9g7pq\" (UID: \"0ba74b24-e523-481e-82b5-080dc7ecb2e2\") " pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:16:11.396595 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:11.396452 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdz4\" (UniqueName: \"kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4\") pod \"network-check-target-47nt5\" (UID: \"bfa20876-9d47-42bf-aad5-24503e05b86e\") " pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:16:11.396595 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.396543 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 11:16:11.396715 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:11.396617 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs podName:0ba74b24-e523-481e-82b5-080dc7ecb2e2 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:43.396598233 +0000 UTC m=+66.263122381 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs") pod "network-metrics-daemon-9g7pq" (UID: "0ba74b24-e523-481e-82b5-080dc7ecb2e2") : secret "metrics-daemon-secret" not found Apr 17 11:16:11.399140 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:11.399116 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfdz4\" (UniqueName: \"kubernetes.io/projected/bfa20876-9d47-42bf-aad5-24503e05b86e-kube-api-access-lfdz4\") pod \"network-check-target-47nt5\" (UID: \"bfa20876-9d47-42bf-aad5-24503e05b86e\") " pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:16:11.629526 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:11.629445 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:16:12.118614 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.118436 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-fdqlp"] Apr 17 11:16:12.125740 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.125701 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm"] Apr 17 11:16:12.127089 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.127068 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-clmj2"] Apr 17 11:16:12.147935 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.147903 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs"] Apr 17 11:16:12.151926 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.151724 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-9wmp5"] Apr 17 
11:16:12.152841 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.152804 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-47nt5"] Apr 17 11:16:12.153602 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.153585 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dl54m"] Apr 17 11:16:12.166964 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:12.166937 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a1529e3_e492_461e_9d34_440e5555b197.slice/crio-a00737a3d2685a43bb8b63b79d7eba09b5524fce74f438b89919d113dd07918d WatchSource:0}: Error finding container a00737a3d2685a43bb8b63b79d7eba09b5524fce74f438b89919d113dd07918d: Status 404 returned error can't find the container with id a00737a3d2685a43bb8b63b79d7eba09b5524fce74f438b89919d113dd07918d Apr 17 11:16:12.167391 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:12.167353 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2778730_467e_4432_9a1a_d5f871276f6d.slice/crio-84930979321b02db31a87b92081db0d756d82650c617f42ea4fc3769927f980e WatchSource:0}: Error finding container 84930979321b02db31a87b92081db0d756d82650c617f42ea4fc3769927f980e: Status 404 returned error can't find the container with id 84930979321b02db31a87b92081db0d756d82650c617f42ea4fc3769927f980e Apr 17 11:16:12.168652 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:12.168391 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2c84c0f_0d60_4465_b1fd_4f39963e95d4.slice/crio-feee9591176a70ee650a7acd559cb081141f8245f702ece48ceb240e2121c32b WatchSource:0}: Error finding container feee9591176a70ee650a7acd559cb081141f8245f702ece48ceb240e2121c32b: Status 404 returned error 
can't find the container with id feee9591176a70ee650a7acd559cb081141f8245f702ece48ceb240e2121c32b Apr 17 11:16:12.170092 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:12.170001 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44d4ecc4_97ed_4995_a7f7_5f731f3fe770.slice/crio-ad30e5c8f46acc4a6a2766fe5627aaedd6505766546c37a30fc5a4abb5b92871 WatchSource:0}: Error finding container ad30e5c8f46acc4a6a2766fe5627aaedd6505766546c37a30fc5a4abb5b92871: Status 404 returned error can't find the container with id ad30e5c8f46acc4a6a2766fe5627aaedd6505766546c37a30fc5a4abb5b92871 Apr 17 11:16:12.170721 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:12.170699 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfa20876_9d47_42bf_aad5_24503e05b86e.slice/crio-b1c76528b281809d3ac2350a7db155f894ee49e467898a8a3a734793f347970c WatchSource:0}: Error finding container b1c76528b281809d3ac2350a7db155f894ee49e467898a8a3a734793f347970c: Status 404 returned error can't find the container with id b1c76528b281809d3ac2350a7db155f894ee49e467898a8a3a734793f347970c Apr 17 11:16:12.177158 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:12.177138 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89b9086c_55f6_4d0b_a998_d22a793d7d17.slice/crio-d01c7a7dcb5fd8da34588b308854ddf25f47334ef20e5f0b6fee8611e72505ae WatchSource:0}: Error finding container d01c7a7dcb5fd8da34588b308854ddf25f47334ef20e5f0b6fee8611e72505ae: Status 404 returned error can't find the container with id d01c7a7dcb5fd8da34588b308854ddf25f47334ef20e5f0b6fee8611e72505ae Apr 17 11:16:12.178893 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:12.178870 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92e7a74f_35b8_4f70_833b_1261a2bed50d.slice/crio-0bf9e5494269bcf5ba6dbe869d5f114715918f2dda4cfb4f84f6678e4510e7cc WatchSource:0}: Error finding container 0bf9e5494269bcf5ba6dbe869d5f114715918f2dda4cfb4f84f6678e4510e7cc: Status 404 returned error can't find the container with id 0bf9e5494269bcf5ba6dbe869d5f114715918f2dda4cfb4f84f6678e4510e7cc Apr 17 11:16:12.205211 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.205186 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-65j5q\" (UID: \"cfa96c8c-1c5c-4749-a74a-b6eab2274afd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:12.205275 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.205263 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:12.205320 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.205302 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:16:12.205365 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.205326 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7lqsd\" (UID: \"a3e70c72-ac65-4a09-b59a-570bf07a6dbb\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" Apr 17 11:16:12.205402 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.205371 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls podName:cfa96c8c-1c5c-4749-a74a-b6eab2274afd nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.205352393 +0000 UTC m=+37.071876527 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-65j5q" (UID: "cfa96c8c-1c5c-4749-a74a-b6eab2274afd") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:16:12.205402 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.205386 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:12.205402 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.205400 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b9d896675-7lskv: secret "image-registry-tls" not found Apr 17 11:16:12.205502 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.205402 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 11:16:12.205502 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.205443 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls podName:97fe64f0-9f87-4b25-876e-a59829b69c04 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.205430021 +0000 UTC m=+37.071954162 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls") pod "image-registry-5b9d896675-7lskv" (UID: "97fe64f0-9f87-4b25-876e-a59829b69c04") : secret "image-registry-tls" not found Apr 17 11:16:12.205502 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.205491 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls podName:a3e70c72-ac65-4a09-b59a-570bf07a6dbb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.205481472 +0000 UTC m=+37.072005602 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7lqsd" (UID: "a3e70c72-ac65-4a09-b59a-570bf07a6dbb") : secret "samples-operator-tls" not found Apr 17 11:16:12.306643 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.306614 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:12.306877 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.306649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:12.306877 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.306683 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert\") pod \"ingress-canary-sswv7\" (UID: \"05a441fa-9d9b-40d1-adfd-ffe296dfb2d0\") " pod="openshift-ingress-canary/ingress-canary-sswv7" Apr 17 11:16:12.306877 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.306742 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:12.306877 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.306753 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:12.306877 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.306787 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:16:12.306877 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.306804 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:12.306877 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.306839 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls podName:4dd7876b-a6b7-4cf0-b645-979aead5bdff nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.306820502 +0000 UTC m=+37.173344640 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls") pod "dns-default-tkcmp" (UID: "4dd7876b-a6b7-4cf0-b645-979aead5bdff") : secret "dns-default-metrics-tls" not found Apr 17 11:16:12.306877 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.306867 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:12.307241 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.306885 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-799db9ddcb-l92ps: secret "image-registry-tls" not found Apr 17 11:16:12.307241 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.306910 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs podName:615b2482-c02f-4752-9ea6-7e7cef5c1fe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.306887859 +0000 UTC m=+37.173412000 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs") pod "router-default-7657d8478-pp7qf" (UID: "615b2482-c02f-4752-9ea6-7e7cef5c1fe9") : secret "router-metrics-certs-default" not found Apr 17 11:16:12.307241 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.306948 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert podName:05a441fa-9d9b-40d1-adfd-ffe296dfb2d0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.306931851 +0000 UTC m=+37.173455993 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert") pod "ingress-canary-sswv7" (UID: "05a441fa-9d9b-40d1-adfd-ffe296dfb2d0") : secret "canary-serving-cert" not found Apr 17 11:16:12.307241 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.306971 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls podName:beb961a2-0877-4cd8-be68-062f895cef5d nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.306961177 +0000 UTC m=+37.173485313 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls") pod "image-registry-799db9ddcb-l92ps" (UID: "beb961a2-0877-4cd8-be68-062f895cef5d") : secret "image-registry-tls" not found Apr 17 11:16:12.307241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.307035 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:12.307241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.307087 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xjjnj\" (UID: \"d90dfee2-676e-4224-8ec8-8d764b523802\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" Apr 17 11:16:12.307241 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.307154 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle podName:615b2482-c02f-4752-9ea6-7e7cef5c1fe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.307144903 +0000 UTC m=+37.173669031 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle") pod "router-default-7657d8478-pp7qf" (UID: "615b2482-c02f-4752-9ea6-7e7cef5c1fe9") : configmap references non-existent config key: service-ca.crt Apr 17 11:16:12.307241 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.307188 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 11:16:12.307241 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:12.307214 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert podName:d90dfee2-676e-4224-8ec8-8d764b523802 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.307206139 +0000 UTC m=+37.173730267 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xjjnj" (UID: "d90dfee2-676e-4224-8ec8-8d764b523802") : secret "networking-console-plugin-cert" not found Apr 17 11:16:12.860714 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.860604 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-47nt5" event={"ID":"bfa20876-9d47-42bf-aad5-24503e05b86e","Type":"ContainerStarted","Data":"b1c76528b281809d3ac2350a7db155f894ee49e467898a8a3a734793f347970c"} Apr 17 11:16:12.866900 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.866864 2579 generic.go:358] "Generic (PLEG): container finished" podID="841bd702-cde2-4bf5-9789-aa664c501f8f" containerID="e9086059625486676a63504a9d4f764137f21f0ec13a0564bac03489ec2acac5" exitCode=0 Apr 17 11:16:12.867031 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.866997 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dvnsm" event={"ID":"841bd702-cde2-4bf5-9789-aa664c501f8f","Type":"ContainerDied","Data":"e9086059625486676a63504a9d4f764137f21f0ec13a0564bac03489ec2acac5"} Apr 17 11:16:12.869879 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.869820 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" event={"ID":"44d4ecc4-97ed-4995-a7f7-5f731f3fe770","Type":"ContainerStarted","Data":"ad30e5c8f46acc4a6a2766fe5627aaedd6505766546c37a30fc5a4abb5b92871"} Apr 17 11:16:12.872168 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.871494 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dl54m" 
event={"ID":"92e7a74f-35b8-4f70-833b-1261a2bed50d","Type":"ContainerStarted","Data":"0bf9e5494269bcf5ba6dbe869d5f114715918f2dda4cfb4f84f6678e4510e7cc"} Apr 17 11:16:12.874670 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.874592 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" event={"ID":"b2c84c0f-0d60-4465-b1fd-4f39963e95d4","Type":"ContainerStarted","Data":"feee9591176a70ee650a7acd559cb081141f8245f702ece48ceb240e2121c32b"} Apr 17 11:16:12.877583 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.877537 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fdqlp" event={"ID":"8a1529e3-e492-461e-9d34-440e5555b197","Type":"ContainerStarted","Data":"a00737a3d2685a43bb8b63b79d7eba09b5524fce74f438b89919d113dd07918d"} Apr 17 11:16:12.880837 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.880740 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-clmj2" event={"ID":"e2778730-467e-4432-9a1a-d5f871276f6d","Type":"ContainerStarted","Data":"84930979321b02db31a87b92081db0d756d82650c617f42ea4fc3769927f980e"} Apr 17 11:16:12.889543 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:12.889501 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" event={"ID":"89b9086c-55f6-4d0b-a998-d22a793d7d17","Type":"ContainerStarted","Data":"d01c7a7dcb5fd8da34588b308854ddf25f47334ef20e5f0b6fee8611e72505ae"} Apr 17 11:16:13.914510 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:13.913500 2579 generic.go:358] "Generic (PLEG): container finished" podID="841bd702-cde2-4bf5-9789-aa664c501f8f" containerID="92dc5f50e5d0d8b461e72658207a8310dfa71759a172bd46f1616f0af43f44a3" exitCode=0 Apr 17 11:16:13.914510 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:13.913585 2579 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dvnsm" event={"ID":"841bd702-cde2-4bf5-9789-aa664c501f8f","Type":"ContainerDied","Data":"92dc5f50e5d0d8b461e72658207a8310dfa71759a172bd46f1616f0af43f44a3"} Apr 17 11:16:14.232724 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:14.231534 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:14.232724 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:14.231855 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7lqsd\" (UID: \"a3e70c72-ac65-4a09-b59a-570bf07a6dbb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" Apr 17 11:16:14.232724 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:14.231985 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-65j5q\" (UID: \"cfa96c8c-1c5c-4749-a74a-b6eab2274afd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:14.232724 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.232119 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:16:14.232724 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.232177 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls podName:cfa96c8c-1c5c-4749-a74a-b6eab2274afd nodeName:}" failed. No retries permitted until 2026-04-17 11:16:18.232159664 +0000 UTC m=+41.098683806 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-65j5q" (UID: "cfa96c8c-1c5c-4749-a74a-b6eab2274afd") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:16:14.232724 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.232562 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:14.232724 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.232574 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b9d896675-7lskv: secret "image-registry-tls" not found Apr 17 11:16:14.232724 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.232612 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls podName:97fe64f0-9f87-4b25-876e-a59829b69c04 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:18.232598082 +0000 UTC m=+41.099122217 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls") pod "image-registry-5b9d896675-7lskv" (UID: "97fe64f0-9f87-4b25-876e-a59829b69c04") : secret "image-registry-tls" not found Apr 17 11:16:14.232724 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.232665 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 11:16:14.232724 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.232695 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls podName:a3e70c72-ac65-4a09-b59a-570bf07a6dbb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:18.23268589 +0000 UTC m=+41.099210019 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7lqsd" (UID: "a3e70c72-ac65-4a09-b59a-570bf07a6dbb") : secret "samples-operator-tls" not found Apr 17 11:16:14.333320 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:14.333277 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xjjnj\" (UID: \"d90dfee2-676e-4224-8ec8-8d764b523802\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" Apr 17 11:16:14.333499 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:14.333370 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls\") pod \"dns-default-tkcmp\" (UID: 
\"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:14.333499 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:14.333408 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:14.333499 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:14.333455 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert\") pod \"ingress-canary-sswv7\" (UID: \"05a441fa-9d9b-40d1-adfd-ffe296dfb2d0\") " pod="openshift-ingress-canary/ingress-canary-sswv7" Apr 17 11:16:14.333669 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:14.333505 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:14.333669 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.333542 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 11:16:14.333669 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:14.333564 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:14.333669 
ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.333627 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert podName:d90dfee2-676e-4224-8ec8-8d764b523802 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:18.333602453 +0000 UTC m=+41.200126600 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xjjnj" (UID: "d90dfee2-676e-4224-8ec8-8d764b523802") : secret "networking-console-plugin-cert" not found Apr 17 11:16:14.333893 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.333703 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle podName:615b2482-c02f-4752-9ea6-7e7cef5c1fe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:18.33368673 +0000 UTC m=+41.200210860 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle") pod "router-default-7657d8478-pp7qf" (UID: "615b2482-c02f-4752-9ea6-7e7cef5c1fe9") : configmap references non-existent config key: service-ca.crt Apr 17 11:16:14.333893 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.333783 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:14.333893 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.333815 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls podName:4dd7876b-a6b7-4cf0-b645-979aead5bdff nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:18.333806542 +0000 UTC m=+41.200330670 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls") pod "dns-default-tkcmp" (UID: "4dd7876b-a6b7-4cf0-b645-979aead5bdff") : secret "dns-default-metrics-tls" not found Apr 17 11:16:14.333893 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.333860 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:16:14.333893 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.333883 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs podName:615b2482-c02f-4752-9ea6-7e7cef5c1fe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:18.333875297 +0000 UTC m=+41.200399425 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs") pod "router-default-7657d8478-pp7qf" (UID: "615b2482-c02f-4752-9ea6-7e7cef5c1fe9") : secret "router-metrics-certs-default" not found Apr 17 11:16:14.334142 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.333916 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:14.334142 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.333933 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert podName:05a441fa-9d9b-40d1-adfd-ffe296dfb2d0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:18.333927344 +0000 UTC m=+41.200451471 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert") pod "ingress-canary-sswv7" (UID: "05a441fa-9d9b-40d1-adfd-ffe296dfb2d0") : secret "canary-serving-cert" not found Apr 17 11:16:14.334142 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.333977 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:14.334142 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.333984 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-799db9ddcb-l92ps: secret "image-registry-tls" not found Apr 17 11:16:14.334142 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:14.334003 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls podName:beb961a2-0877-4cd8-be68-062f895cef5d nodeName:}" failed. No retries permitted until 2026-04-17 11:16:18.333996772 +0000 UTC m=+41.200520899 (durationBeforeRetry 4s). 
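The records above show the kubelet's retry backoff for failed volume mounts: each `nestedpendingoperations` failure schedules the next attempt after `durationBeforeRetry`, which doubles while the referenced secret stays missing (4s in this round, 8s in the round at 11:16:18 further down). A minimal sketch of pulling those delays out of a capture like this one — the sample strings are abbreviated stand-ins for the full records, and the parsing is an assumption about the field's textual format, not a kubelet API:

```python
import re

# Extract durationBeforeRetry values from kubelet nestedpendingoperations
# records. Field name taken from the log above; sample lines abbreviated.
PATTERN = re.compile(r"\(durationBeforeRetry (\d+)([smh])\)")

def retry_delays(lines):
    """Return the backoff delays, in seconds, found in an iterable of log lines."""
    unit = {"s": 1, "m": 60, "h": 3600}
    out = []
    for line in lines:
        for value, suffix in PATTERN.findall(line):
            out.append(int(value) * unit[suffix])
    return out

sample = [
    "No retries permitted until 2026-04-17 11:16:18 (durationBeforeRetry 4s).",
    "No retries permitted until 2026-04-17 11:16:26 (durationBeforeRetry 8s).",
]
print(retry_delays(sample))  # -> [4, 8]
```

The doubling sequence (4s, 8s, ...) is why the same mount errors reappear at 11:16:18 with an 8-second retry window.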
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls") pod "image-registry-799db9ddcb-l92ps" (UID: "beb961a2-0877-4cd8-be68-062f895cef5d") : secret "image-registry-tls" not found Apr 17 11:16:14.921252 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:14.921071 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dvnsm" event={"ID":"841bd702-cde2-4bf5-9789-aa664c501f8f","Type":"ContainerStarted","Data":"0ef7691ccdf6a38bae70ad8d575a2bb542c2953d4ec09958e8a22db3a2759862"} Apr 17 11:16:14.964641 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:14.964448 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dvnsm" podStartSLOduration=6.134566903 podStartE2EDuration="37.964428695s" podCreationTimestamp="2026-04-17 11:15:37 +0000 UTC" firstStartedPulling="2026-04-17 11:15:40.39536745 +0000 UTC m=+3.261891578" lastFinishedPulling="2026-04-17 11:16:12.225229243 +0000 UTC m=+35.091753370" observedRunningTime="2026-04-17 11:16:14.964053047 +0000 UTC m=+37.830577232" watchObservedRunningTime="2026-04-17 11:16:14.964428695 +0000 UTC m=+37.830952847" Apr 17 11:16:16.257020 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:16.256980 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qv726"] Apr 17 11:16:16.263786 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:16.263742 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qv726" Apr 17 11:16:16.266650 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:16.266633 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 11:16:16.275636 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:16.275609 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qv726"] Apr 17 11:16:16.455878 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:16.455838 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/65ec91b4-9666-4574-ad59-be0c3d01c971-kubelet-config\") pod \"global-pull-secret-syncer-qv726\" (UID: \"65ec91b4-9666-4574-ad59-be0c3d01c971\") " pod="kube-system/global-pull-secret-syncer-qv726" Apr 17 11:16:16.456081 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:16.456040 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65ec91b4-9666-4574-ad59-be0c3d01c971-original-pull-secret\") pod \"global-pull-secret-syncer-qv726\" (UID: \"65ec91b4-9666-4574-ad59-be0c3d01c971\") " pod="kube-system/global-pull-secret-syncer-qv726" Apr 17 11:16:16.456216 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:16.456197 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/65ec91b4-9666-4574-ad59-be0c3d01c971-dbus\") pod \"global-pull-secret-syncer-qv726\" (UID: \"65ec91b4-9666-4574-ad59-be0c3d01c971\") " pod="kube-system/global-pull-secret-syncer-qv726" Apr 17 11:16:16.557787 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:16.557679 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/65ec91b4-9666-4574-ad59-be0c3d01c971-kubelet-config\") pod \"global-pull-secret-syncer-qv726\" (UID: \"65ec91b4-9666-4574-ad59-be0c3d01c971\") " pod="kube-system/global-pull-secret-syncer-qv726" Apr 17 11:16:16.558000 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:16.557831 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/65ec91b4-9666-4574-ad59-be0c3d01c971-kubelet-config\") pod \"global-pull-secret-syncer-qv726\" (UID: \"65ec91b4-9666-4574-ad59-be0c3d01c971\") " pod="kube-system/global-pull-secret-syncer-qv726" Apr 17 11:16:16.558000 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:16.557840 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65ec91b4-9666-4574-ad59-be0c3d01c971-original-pull-secret\") pod \"global-pull-secret-syncer-qv726\" (UID: \"65ec91b4-9666-4574-ad59-be0c3d01c971\") " pod="kube-system/global-pull-secret-syncer-qv726" Apr 17 11:16:16.558000 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:16.557927 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/65ec91b4-9666-4574-ad59-be0c3d01c971-dbus\") pod \"global-pull-secret-syncer-qv726\" (UID: \"65ec91b4-9666-4574-ad59-be0c3d01c971\") " pod="kube-system/global-pull-secret-syncer-qv726" Apr 17 11:16:16.558197 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:16.558178 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/65ec91b4-9666-4574-ad59-be0c3d01c971-dbus\") pod \"global-pull-secret-syncer-qv726\" (UID: \"65ec91b4-9666-4574-ad59-be0c3d01c971\") " pod="kube-system/global-pull-secret-syncer-qv726" Apr 17 11:16:16.563013 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:16.562988 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65ec91b4-9666-4574-ad59-be0c3d01c971-original-pull-secret\") pod \"global-pull-secret-syncer-qv726\" (UID: \"65ec91b4-9666-4574-ad59-be0c3d01c971\") " pod="kube-system/global-pull-secret-syncer-qv726" Apr 17 11:16:16.572856 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:16.572831 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qv726" Apr 17 11:16:18.273259 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:18.273214 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-65j5q\" (UID: \"cfa96c8c-1c5c-4749-a74a-b6eab2274afd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:18.273869 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:18.273301 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:18.273869 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:18.273361 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7lqsd\" (UID: \"a3e70c72-ac65-4a09-b59a-570bf07a6dbb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" Apr 17 11:16:18.273869 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.273404 2579 secret.go:189] Couldn't get secret 
openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:16:18.273869 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.273460 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 11:16:18.273869 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.273481 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:18.273869 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.273501 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls podName:cfa96c8c-1c5c-4749-a74a-b6eab2274afd nodeName:}" failed. No retries permitted until 2026-04-17 11:16:26.273476992 +0000 UTC m=+49.140001120 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-65j5q" (UID: "cfa96c8c-1c5c-4749-a74a-b6eab2274afd") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:16:18.273869 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.273502 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b9d896675-7lskv: secret "image-registry-tls" not found Apr 17 11:16:18.273869 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.273526 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls podName:a3e70c72-ac65-4a09-b59a-570bf07a6dbb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:26.273515129 +0000 UTC m=+49.140039257 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7lqsd" (UID: "a3e70c72-ac65-4a09-b59a-570bf07a6dbb") : secret "samples-operator-tls" not found Apr 17 11:16:18.273869 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.273580 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls podName:97fe64f0-9f87-4b25-876e-a59829b69c04 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:26.27355974 +0000 UTC m=+49.140083876 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls") pod "image-registry-5b9d896675-7lskv" (UID: "97fe64f0-9f87-4b25-876e-a59829b69c04") : secret "image-registry-tls" not found Apr 17 11:16:18.374946 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:18.374914 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:18.375128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:18.374972 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xjjnj\" (UID: \"d90dfee2-676e-4224-8ec8-8d764b523802\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" Apr 17 11:16:18.375128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:18.375031 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:18.375128 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:18.375061 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:18.375128 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.375098 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle podName:615b2482-c02f-4752-9ea6-7e7cef5c1fe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:26.375079157 +0000 UTC m=+49.241603288 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle") pod "router-default-7657d8478-pp7qf" (UID: "615b2482-c02f-4752-9ea6-7e7cef5c1fe9") : configmap references non-existent config key: service-ca.crt Apr 17 11:16:18.375348 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.375127 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 11:16:18.375348 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.375183 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert podName:d90dfee2-676e-4224-8ec8-8d764b523802 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:26.375167268 +0000 UTC m=+49.241691399 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xjjnj" (UID: "d90dfee2-676e-4224-8ec8-8d764b523802") : secret "networking-console-plugin-cert" not found Apr 17 11:16:18.375348 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:18.375134 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert\") pod \"ingress-canary-sswv7\" (UID: \"05a441fa-9d9b-40d1-adfd-ffe296dfb2d0\") " pod="openshift-ingress-canary/ingress-canary-sswv7" Apr 17 11:16:18.375348 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.375207 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:18.375348 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.375213 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:18.375348 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.375254 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert podName:05a441fa-9d9b-40d1-adfd-ffe296dfb2d0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:26.375242262 +0000 UTC m=+49.241766393 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert") pod "ingress-canary-sswv7" (UID: "05a441fa-9d9b-40d1-adfd-ffe296dfb2d0") : secret "canary-serving-cert" not found Apr 17 11:16:18.375348 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:18.375253 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:18.375348 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.375208 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:16:18.375348 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.375277 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls podName:4dd7876b-a6b7-4cf0-b645-979aead5bdff nodeName:}" failed. No retries permitted until 2026-04-17 11:16:26.375260264 +0000 UTC m=+49.241784410 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls") pod "dns-default-tkcmp" (UID: "4dd7876b-a6b7-4cf0-b645-979aead5bdff") : secret "dns-default-metrics-tls" not found Apr 17 11:16:18.375348 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.375302 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs podName:615b2482-c02f-4752-9ea6-7e7cef5c1fe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:26.375292502 +0000 UTC m=+49.241816637 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs") pod "router-default-7657d8478-pp7qf" (UID: "615b2482-c02f-4752-9ea6-7e7cef5c1fe9") : secret "router-metrics-certs-default" not found Apr 17 11:16:18.375348 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.375311 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:18.375348 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.375322 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-799db9ddcb-l92ps: secret "image-registry-tls" not found Apr 17 11:16:18.375729 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:18.375360 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls podName:beb961a2-0877-4cd8-be68-062f895cef5d nodeName:}" failed. No retries permitted until 2026-04-17 11:16:26.375350005 +0000 UTC m=+49.241874160 (durationBeforeRetry 8s). 
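Every failure in this 8-second round traces back to a handful of secrets that their owning operators have not created yet. One way to summarize a capture like this is to tally the `secret ... not found` records per namespace/name — a rough sketch, with the regex inferred from the record shape above and the sample text abbreviated rather than copied in full:

```python
import re
from collections import Counter

# Tally "Couldn't get secret <ns>/<name>: secret ... not found" records
# from kubelet log text. Regex shape inferred from the records above.
MISSING = re.compile(r"Couldn't get secret (\S+): secret \"([^\"]+)\" not found")

def missing_secrets(text):
    """Return a Counter of namespace/name pairs still reported as missing."""
    return Counter(ns_name for ns_name, _ in MISSING.findall(text))

sample = """\
E0417 11:16:18.273404 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
E0417 11:16:18.273481 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
"""

print(missing_secrets(sample))
```

Pods stuck this way typically recover on their own once the operators publish the secrets; the kubelet simply keeps retrying on its backoff schedule.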
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls") pod "image-registry-799db9ddcb-l92ps" (UID: "beb961a2-0877-4cd8-be68-062f895cef5d") : secret "image-registry-tls" not found Apr 17 11:16:19.482638 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.482610 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qv726"] Apr 17 11:16:19.485896 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:19.485865 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65ec91b4_9666_4574_ad59_be0c3d01c971.slice/crio-ddbca1a782140e5868753c3d256458434e31f6ef9fd7d91e1bebe790c296b6fb WatchSource:0}: Error finding container ddbca1a782140e5868753c3d256458434e31f6ef9fd7d91e1bebe790c296b6fb: Status 404 returned error can't find the container with id ddbca1a782140e5868753c3d256458434e31f6ef9fd7d91e1bebe790c296b6fb Apr 17 11:16:19.933588 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.933550 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-47nt5" event={"ID":"bfa20876-9d47-42bf-aad5-24503e05b86e","Type":"ContainerStarted","Data":"feb033d097df129a733d2a0d252004699520ada5611f976938d6f07b692ee269"} Apr 17 11:16:19.933808 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.933669 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-47nt5" Apr 17 11:16:19.935483 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.935114 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" event={"ID":"44d4ecc4-97ed-4995-a7f7-5f731f3fe770","Type":"ContainerStarted","Data":"e933106ac24b6a4f0acbd9e78c511e7f4a936b508be81cd6ae14e5012263f6d0"} Apr 17 11:16:19.936855 ip-10-0-133-190 
kubenswrapper[2579]: I0417 11:16:19.936426 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dl54m" event={"ID":"92e7a74f-35b8-4f70-833b-1261a2bed50d","Type":"ContainerStarted","Data":"66572e2338f80d1e4d54b6b69cd15df9372f56b7f350355fae6db3b45aecfa08"} Apr 17 11:16:19.937739 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.937704 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" event={"ID":"b2c84c0f-0d60-4465-b1fd-4f39963e95d4","Type":"ContainerStarted","Data":"e8abba0ae748cf70ef095c45bd134dc8c42dc0475e7e81b63b03ae1ef3a87b81"} Apr 17 11:16:19.938710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.938689 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qv726" event={"ID":"65ec91b4-9666-4574-ad59-be0c3d01c971","Type":"ContainerStarted","Data":"ddbca1a782140e5868753c3d256458434e31f6ef9fd7d91e1bebe790c296b6fb"} Apr 17 11:16:19.940161 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.940124 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fdqlp" event={"ID":"8a1529e3-e492-461e-9d34-440e5555b197","Type":"ContainerStarted","Data":"697cb3f9e05f4f52f3031e11f1a188391a74604232ff286e83a97e769bac7a69"} Apr 17 11:16:19.941451 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.941427 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-clmj2" event={"ID":"e2778730-467e-4432-9a1a-d5f871276f6d","Type":"ContainerStarted","Data":"21d1de5adf9de028226027355f32c884a7981de5710e43f398f0df629341a91b"} Apr 17 11:16:19.942956 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.942932 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9wmp5_89b9086c-55f6-4d0b-a998-d22a793d7d17/console-operator/0.log" Apr 17 11:16:19.943053 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.942975 2579 generic.go:358] "Generic (PLEG): container finished" podID="89b9086c-55f6-4d0b-a998-d22a793d7d17" containerID="2002fc9670b4eb994e14a5feb42528481ceceb7df1724b8c2b87c96f7b641515" exitCode=255 Apr 17 11:16:19.943053 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.943004 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" event={"ID":"89b9086c-55f6-4d0b-a998-d22a793d7d17","Type":"ContainerDied","Data":"2002fc9670b4eb994e14a5feb42528481ceceb7df1724b8c2b87c96f7b641515"} Apr 17 11:16:19.943189 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.943175 2579 scope.go:117] "RemoveContainer" containerID="2002fc9670b4eb994e14a5feb42528481ceceb7df1724b8c2b87c96f7b641515" Apr 17 11:16:19.948412 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.948367 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-47nt5" podStartSLOduration=35.659178157 podStartE2EDuration="42.948352074s" podCreationTimestamp="2026-04-17 11:15:37 +0000 UTC" firstStartedPulling="2026-04-17 11:16:12.172518814 +0000 UTC m=+35.039042942" lastFinishedPulling="2026-04-17 11:16:19.46169273 +0000 UTC m=+42.328216859" observedRunningTime="2026-04-17 11:16:19.947054236 +0000 UTC m=+42.813578386" watchObservedRunningTime="2026-04-17 11:16:19.948352074 +0000 UTC m=+42.814876226" Apr 17 11:16:19.964294 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.964025 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" podStartSLOduration=31.761964812 podStartE2EDuration="38.963996411s" podCreationTimestamp="2026-04-17 11:15:41 
+0000 UTC" firstStartedPulling="2026-04-17 11:16:12.170955267 +0000 UTC m=+35.037479408" lastFinishedPulling="2026-04-17 11:16:19.372986872 +0000 UTC m=+42.239511007" observedRunningTime="2026-04-17 11:16:19.963343164 +0000 UTC m=+42.829867317" watchObservedRunningTime="2026-04-17 11:16:19.963996411 +0000 UTC m=+42.830520562" Apr 17 11:16:19.999890 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.997617 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dl54m" podStartSLOduration=31.782516897 podStartE2EDuration="38.997597192s" podCreationTimestamp="2026-04-17 11:15:41 +0000 UTC" firstStartedPulling="2026-04-17 11:16:12.201594155 +0000 UTC m=+35.068118287" lastFinishedPulling="2026-04-17 11:16:19.416674453 +0000 UTC m=+42.283198582" observedRunningTime="2026-04-17 11:16:19.978890339 +0000 UTC m=+42.845414492" watchObservedRunningTime="2026-04-17 11:16:19.997597192 +0000 UTC m=+42.864121343" Apr 17 11:16:19.999890 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:19.998248 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-clmj2" podStartSLOduration=31.794674958 podStartE2EDuration="38.998239661s" podCreationTimestamp="2026-04-17 11:15:41 +0000 UTC" firstStartedPulling="2026-04-17 11:16:12.169416071 +0000 UTC m=+35.035940209" lastFinishedPulling="2026-04-17 11:16:19.37298077 +0000 UTC m=+42.239504912" observedRunningTime="2026-04-17 11:16:19.995924478 +0000 UTC m=+42.862448629" watchObservedRunningTime="2026-04-17 11:16:19.998239661 +0000 UTC m=+42.864763812" Apr 17 11:16:20.010754 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:20.010589 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fdqlp" podStartSLOduration=31.732890305 podStartE2EDuration="39.010575047s" podCreationTimestamp="2026-04-17 
11:15:41 +0000 UTC" firstStartedPulling="2026-04-17 11:16:12.169153944 +0000 UTC m=+35.035678088" lastFinishedPulling="2026-04-17 11:16:19.446838703 +0000 UTC m=+42.313362830" observedRunningTime="2026-04-17 11:16:20.009589327 +0000 UTC m=+42.876113477" watchObservedRunningTime="2026-04-17 11:16:20.010575047 +0000 UTC m=+42.877099199" Apr 17 11:16:20.025830 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:20.025706 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" podStartSLOduration=31.781089847 podStartE2EDuration="39.025690184s" podCreationTimestamp="2026-04-17 11:15:41 +0000 UTC" firstStartedPulling="2026-04-17 11:16:12.171982946 +0000 UTC m=+35.038507078" lastFinishedPulling="2026-04-17 11:16:19.416583285 +0000 UTC m=+42.283107415" observedRunningTime="2026-04-17 11:16:20.025217103 +0000 UTC m=+42.891741253" watchObservedRunningTime="2026-04-17 11:16:20.025690184 +0000 UTC m=+42.892214336" Apr 17 11:16:20.905831 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:20.905795 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:20.905831 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:20.905838 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:20.948205 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:20.948173 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9wmp5_89b9086c-55f6-4d0b-a998-d22a793d7d17/console-operator/1.log" Apr 17 11:16:20.948625 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:20.948605 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9wmp5_89b9086c-55f6-4d0b-a998-d22a793d7d17/console-operator/0.log" Apr 17 
11:16:20.948736 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:20.948642 2579 generic.go:358] "Generic (PLEG): container finished" podID="89b9086c-55f6-4d0b-a998-d22a793d7d17" containerID="46e112ae8eedeb5bcdfddec800a624843f1dc2a0c24084b3a6af1884704d3e48" exitCode=255 Apr 17 11:16:20.950070 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:20.950043 2579 scope.go:117] "RemoveContainer" containerID="46e112ae8eedeb5bcdfddec800a624843f1dc2a0c24084b3a6af1884704d3e48" Apr 17 11:16:20.950229 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:20.950208 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-9wmp5_openshift-console-operator(89b9086c-55f6-4d0b-a998-d22a793d7d17)\"" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" podUID="89b9086c-55f6-4d0b-a998-d22a793d7d17" Apr 17 11:16:20.950441 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:20.950421 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" event={"ID":"89b9086c-55f6-4d0b-a998-d22a793d7d17","Type":"ContainerDied","Data":"46e112ae8eedeb5bcdfddec800a624843f1dc2a0c24084b3a6af1884704d3e48"} Apr 17 11:16:20.950536 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:20.950459 2579 scope.go:117] "RemoveContainer" containerID="2002fc9670b4eb994e14a5feb42528481ceceb7df1724b8c2b87c96f7b641515" Apr 17 11:16:21.841678 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:21.841646 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-w48tf"] Apr 17 11:16:21.875250 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:21.875213 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-w48tf"] Apr 17 11:16:21.875398 ip-10-0-133-190 kubenswrapper[2579]: I0417 
11:16:21.875359 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w48tf" Apr 17 11:16:21.877172 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:21.877147 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:21.877318 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:21.877147 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 11:16:21.877476 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:21.877457 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-k5gww\"" Apr 17 11:16:21.909401 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:21.909371 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg2ps\" (UniqueName: \"kubernetes.io/projected/55842e83-3d9a-4294-a446-3bfe192d7a19-kube-api-access-tg2ps\") pod \"migrator-74bb7799d9-w48tf\" (UID: \"55842e83-3d9a-4294-a446-3bfe192d7a19\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w48tf" Apr 17 11:16:21.953169 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:21.953140 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9wmp5_89b9086c-55f6-4d0b-a998-d22a793d7d17/console-operator/1.log" Apr 17 11:16:21.953659 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:21.953638 2579 scope.go:117] "RemoveContainer" containerID="46e112ae8eedeb5bcdfddec800a624843f1dc2a0c24084b3a6af1884704d3e48" Apr 17 11:16:21.953898 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:21.953868 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-9wmp5_openshift-console-operator(89b9086c-55f6-4d0b-a998-d22a793d7d17)\"" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" podUID="89b9086c-55f6-4d0b-a998-d22a793d7d17" Apr 17 11:16:22.010376 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:22.010335 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tg2ps\" (UniqueName: \"kubernetes.io/projected/55842e83-3d9a-4294-a446-3bfe192d7a19-kube-api-access-tg2ps\") pod \"migrator-74bb7799d9-w48tf\" (UID: \"55842e83-3d9a-4294-a446-3bfe192d7a19\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w48tf" Apr 17 11:16:22.017853 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:22.017831 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg2ps\" (UniqueName: \"kubernetes.io/projected/55842e83-3d9a-4294-a446-3bfe192d7a19-kube-api-access-tg2ps\") pod \"migrator-74bb7799d9-w48tf\" (UID: \"55842e83-3d9a-4294-a446-3bfe192d7a19\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w48tf" Apr 17 11:16:22.186926 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:22.186888 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w48tf" Apr 17 11:16:22.319850 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:22.319605 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-w48tf"] Apr 17 11:16:22.322600 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:22.322568 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55842e83_3d9a_4294_a446_3bfe192d7a19.slice/crio-fd8a14466bd28dce5073309b8fedd0cbb218f7a6bbb9ac2820eb1bcb05b51410 WatchSource:0}: Error finding container fd8a14466bd28dce5073309b8fedd0cbb218f7a6bbb9ac2820eb1bcb05b51410: Status 404 returned error can't find the container with id fd8a14466bd28dce5073309b8fedd0cbb218f7a6bbb9ac2820eb1bcb05b51410 Apr 17 11:16:22.872747 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:22.872712 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jrxcq_7920eeb8-72c9-4fe5-aff5-30f78ed7f840/dns-node-resolver/0.log" Apr 17 11:16:22.958820 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:22.958782 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w48tf" event={"ID":"55842e83-3d9a-4294-a446-3bfe192d7a19","Type":"ContainerStarted","Data":"fd8a14466bd28dce5073309b8fedd0cbb218f7a6bbb9ac2820eb1bcb05b51410"} Apr 17 11:16:22.959216 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:22.959141 2579 scope.go:117] "RemoveContainer" containerID="46e112ae8eedeb5bcdfddec800a624843f1dc2a0c24084b3a6af1884704d3e48" Apr 17 11:16:22.959376 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:22.959352 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-9wmp5_openshift-console-operator(89b9086c-55f6-4d0b-a998-d22a793d7d17)\"" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" podUID="89b9086c-55f6-4d0b-a998-d22a793d7d17" Apr 17 11:16:23.472697 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:23.472663 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2678k_0a2631b2-add8-43b1-a9b4-b872018c7373/node-ca/0.log" Apr 17 11:16:24.048291 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.048260 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-s8x9c"] Apr 17 11:16:24.051350 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.051331 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-s8x9c" Apr 17 11:16:24.053842 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.053354 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 11:16:24.053842 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.053468 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 11:16:24.053842 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.053472 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 11:16:24.053842 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.053526 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 11:16:24.054097 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.053929 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-mv8tt\"" Apr 17 11:16:24.061732 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.061708 
2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-s8x9c"] Apr 17 11:16:24.133586 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.133552 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbgxp\" (UniqueName: \"kubernetes.io/projected/0a6c109d-8efa-4a59-8cd5-200458c2247c-kube-api-access-kbgxp\") pod \"service-ca-865cb79987-s8x9c\" (UID: \"0a6c109d-8efa-4a59-8cd5-200458c2247c\") " pod="openshift-service-ca/service-ca-865cb79987-s8x9c" Apr 17 11:16:24.133789 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.133645 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a6c109d-8efa-4a59-8cd5-200458c2247c-signing-cabundle\") pod \"service-ca-865cb79987-s8x9c\" (UID: \"0a6c109d-8efa-4a59-8cd5-200458c2247c\") " pod="openshift-service-ca/service-ca-865cb79987-s8x9c" Apr 17 11:16:24.133789 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.133781 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a6c109d-8efa-4a59-8cd5-200458c2247c-signing-key\") pod \"service-ca-865cb79987-s8x9c\" (UID: \"0a6c109d-8efa-4a59-8cd5-200458c2247c\") " pod="openshift-service-ca/service-ca-865cb79987-s8x9c" Apr 17 11:16:24.234825 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.234782 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a6c109d-8efa-4a59-8cd5-200458c2247c-signing-key\") pod \"service-ca-865cb79987-s8x9c\" (UID: \"0a6c109d-8efa-4a59-8cd5-200458c2247c\") " pod="openshift-service-ca/service-ca-865cb79987-s8x9c" Apr 17 11:16:24.234975 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.234911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kbgxp\" (UniqueName: \"kubernetes.io/projected/0a6c109d-8efa-4a59-8cd5-200458c2247c-kube-api-access-kbgxp\") pod \"service-ca-865cb79987-s8x9c\" (UID: \"0a6c109d-8efa-4a59-8cd5-200458c2247c\") " pod="openshift-service-ca/service-ca-865cb79987-s8x9c" Apr 17 11:16:24.235016 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.234972 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a6c109d-8efa-4a59-8cd5-200458c2247c-signing-cabundle\") pod \"service-ca-865cb79987-s8x9c\" (UID: \"0a6c109d-8efa-4a59-8cd5-200458c2247c\") " pod="openshift-service-ca/service-ca-865cb79987-s8x9c" Apr 17 11:16:24.235717 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.235691 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a6c109d-8efa-4a59-8cd5-200458c2247c-signing-cabundle\") pod \"service-ca-865cb79987-s8x9c\" (UID: \"0a6c109d-8efa-4a59-8cd5-200458c2247c\") " pod="openshift-service-ca/service-ca-865cb79987-s8x9c" Apr 17 11:16:24.237453 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.237428 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a6c109d-8efa-4a59-8cd5-200458c2247c-signing-key\") pod \"service-ca-865cb79987-s8x9c\" (UID: \"0a6c109d-8efa-4a59-8cd5-200458c2247c\") " pod="openshift-service-ca/service-ca-865cb79987-s8x9c" Apr 17 11:16:24.243231 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.243205 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbgxp\" (UniqueName: \"kubernetes.io/projected/0a6c109d-8efa-4a59-8cd5-200458c2247c-kube-api-access-kbgxp\") pod \"service-ca-865cb79987-s8x9c\" (UID: \"0a6c109d-8efa-4a59-8cd5-200458c2247c\") " pod="openshift-service-ca/service-ca-865cb79987-s8x9c" Apr 17 11:16:24.363671 ip-10-0-133-190 kubenswrapper[2579]: I0417 
11:16:24.363590 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-s8x9c" Apr 17 11:16:24.620170 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.619740 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-s8x9c"] Apr 17 11:16:24.627051 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:24.627020 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a6c109d_8efa_4a59_8cd5_200458c2247c.slice/crio-10494b4bcaa49f0a86ed5f49755c2967a31d2f39fd75fdc91bc24ca1626f5fc3 WatchSource:0}: Error finding container 10494b4bcaa49f0a86ed5f49755c2967a31d2f39fd75fdc91bc24ca1626f5fc3: Status 404 returned error can't find the container with id 10494b4bcaa49f0a86ed5f49755c2967a31d2f39fd75fdc91bc24ca1626f5fc3 Apr 17 11:16:24.966576 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.966542 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qv726" event={"ID":"65ec91b4-9666-4574-ad59-be0c3d01c971","Type":"ContainerStarted","Data":"3ca6c2aef73c1cebdd6e7d9da5986e7a3d73843f386b9db1201754647abdeafd"} Apr 17 11:16:24.968184 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.968150 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w48tf" event={"ID":"55842e83-3d9a-4294-a446-3bfe192d7a19","Type":"ContainerStarted","Data":"a9bdfac6c58ee81ade23128cfb0a484742603dffe0ac8798685ec517a4ceca4f"} Apr 17 11:16:24.968184 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.968183 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w48tf" event={"ID":"55842e83-3d9a-4294-a446-3bfe192d7a19","Type":"ContainerStarted","Data":"74c95b3037602e900f56cd2f2fd401dda509a288afe113facf43894bc4200e54"} Apr 17 11:16:24.969415 ip-10-0-133-190 
kubenswrapper[2579]: I0417 11:16:24.969390 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-s8x9c" event={"ID":"0a6c109d-8efa-4a59-8cd5-200458c2247c","Type":"ContainerStarted","Data":"663ad1e27882b276836e6d968caf7399bc722f9f7d83ebedf719d90be22493c2"} Apr 17 11:16:24.969552 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.969419 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-s8x9c" event={"ID":"0a6c109d-8efa-4a59-8cd5-200458c2247c","Type":"ContainerStarted","Data":"10494b4bcaa49f0a86ed5f49755c2967a31d2f39fd75fdc91bc24ca1626f5fc3"} Apr 17 11:16:24.983756 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.983704 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qv726" podStartSLOduration=4.005625847 podStartE2EDuration="8.983685903s" podCreationTimestamp="2026-04-17 11:16:16 +0000 UTC" firstStartedPulling="2026-04-17 11:16:19.487494529 +0000 UTC m=+42.354018658" lastFinishedPulling="2026-04-17 11:16:24.465554587 +0000 UTC m=+47.332078714" observedRunningTime="2026-04-17 11:16:24.982996617 +0000 UTC m=+47.849520768" watchObservedRunningTime="2026-04-17 11:16:24.983685903 +0000 UTC m=+47.850210054" Apr 17 11:16:24.999782 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:24.999718 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-s8x9c" podStartSLOduration=0.999704531 podStartE2EDuration="999.704531ms" podCreationTimestamp="2026-04-17 11:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:24.998319511 +0000 UTC m=+47.864843660" watchObservedRunningTime="2026-04-17 11:16:24.999704531 +0000 UTC m=+47.866228681" Apr 17 11:16:25.014735 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:25.014687 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w48tf" podStartSLOduration=1.8770114549999999 podStartE2EDuration="4.014671252s" podCreationTimestamp="2026-04-17 11:16:21 +0000 UTC" firstStartedPulling="2026-04-17 11:16:22.324663681 +0000 UTC m=+45.191187813" lastFinishedPulling="2026-04-17 11:16:24.462323467 +0000 UTC m=+47.328847610" observedRunningTime="2026-04-17 11:16:25.014032227 +0000 UTC m=+47.880556377" watchObservedRunningTime="2026-04-17 11:16:25.014671252 +0000 UTC m=+47.881195405" Apr 17 11:16:26.354716 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:26.353910 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7lqsd\" (UID: \"a3e70c72-ac65-4a09-b59a-570bf07a6dbb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" Apr 17 11:16:26.354716 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:26.354058 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-65j5q\" (UID: \"cfa96c8c-1c5c-4749-a74a-b6eab2274afd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:26.354716 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.354089 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 11:16:26.354716 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:26.354121 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:26.354716 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.354257 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:16:26.354716 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.354264 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:26.354716 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.354279 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b9d896675-7lskv: secret "image-registry-tls" not found Apr 17 11:16:26.354716 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.354339 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls podName:a3e70c72-ac65-4a09-b59a-570bf07a6dbb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:42.354299521 +0000 UTC m=+65.220823652 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7lqsd" (UID: "a3e70c72-ac65-4a09-b59a-570bf07a6dbb") : secret "samples-operator-tls" not found Apr 17 11:16:26.354716 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.354358 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls podName:97fe64f0-9f87-4b25-876e-a59829b69c04 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:42.354348429 +0000 UTC m=+65.220872563 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls") pod "image-registry-5b9d896675-7lskv" (UID: "97fe64f0-9f87-4b25-876e-a59829b69c04") : secret "image-registry-tls" not found Apr 17 11:16:26.354716 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.354374 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls podName:cfa96c8c-1c5c-4749-a74a-b6eab2274afd nodeName:}" failed. No retries permitted until 2026-04-17 11:16:42.35436553 +0000 UTC m=+65.220889663 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-65j5q" (UID: "cfa96c8c-1c5c-4749-a74a-b6eab2274afd") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:16:26.455754 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:26.455596 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:26.455754 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:26.455655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:26.455754 
ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:26.455683 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xjjnj\" (UID: \"d90dfee2-676e-4224-8ec8-8d764b523802\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" Apr 17 11:16:26.456051 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.455858 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 11:16:26.456051 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.455929 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert podName:d90dfee2-676e-4224-8ec8-8d764b523802 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:42.455904753 +0000 UTC m=+65.322428882 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xjjnj" (UID: "d90dfee2-676e-4224-8ec8-8d764b523802") : secret "networking-console-plugin-cert" not found Apr 17 11:16:26.456051 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.456018 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:26.456051 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.456032 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-799db9ddcb-l92ps: secret "image-registry-tls" not found Apr 17 11:16:26.456233 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.456070 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls podName:beb961a2-0877-4cd8-be68-062f895cef5d nodeName:}" failed. No retries permitted until 2026-04-17 11:16:42.45605733 +0000 UTC m=+65.322581459 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls") pod "image-registry-799db9ddcb-l92ps" (UID: "beb961a2-0877-4cd8-be68-062f895cef5d") : secret "image-registry-tls" not found Apr 17 11:16:26.456233 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.456151 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle podName:615b2482-c02f-4752-9ea6-7e7cef5c1fe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:42.45613888 +0000 UTC m=+65.322663010 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle") pod "router-default-7657d8478-pp7qf" (UID: "615b2482-c02f-4752-9ea6-7e7cef5c1fe9") : configmap references non-existent config key: service-ca.crt Apr 17 11:16:26.456532 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:26.456509 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:26.456600 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:26.456565 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:26.456648 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:26.456612 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert\") pod \"ingress-canary-sswv7\" (UID: \"05a441fa-9d9b-40d1-adfd-ffe296dfb2d0\") " pod="openshift-ingress-canary/ingress-canary-sswv7" Apr 17 11:16:26.456753 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.456739 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:26.456821 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.456802 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert podName:05a441fa-9d9b-40d1-adfd-ffe296dfb2d0 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:42.45678831 +0000 UTC m=+65.323312454 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert") pod "ingress-canary-sswv7" (UID: "05a441fa-9d9b-40d1-adfd-ffe296dfb2d0") : secret "canary-serving-cert" not found Apr 17 11:16:26.456977 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.456927 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:26.456977 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.456973 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls podName:4dd7876b-a6b7-4cf0-b645-979aead5bdff nodeName:}" failed. No retries permitted until 2026-04-17 11:16:42.456960714 +0000 UTC m=+65.323484856 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls") pod "dns-default-tkcmp" (UID: "4dd7876b-a6b7-4cf0-b645-979aead5bdff") : secret "dns-default-metrics-tls" not found Apr 17 11:16:26.457097 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.457015 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:16:26.457097 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:26.457052 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs podName:615b2482-c02f-4752-9ea6-7e7cef5c1fe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:42.457040113 +0000 UTC m=+65.323564242 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs") pod "router-default-7657d8478-pp7qf" (UID: "615b2482-c02f-4752-9ea6-7e7cef5c1fe9") : secret "router-metrics-certs-default" not found Apr 17 11:16:30.905740 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:30.905697 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:30.905740 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:30.905731 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:30.906301 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:30.906105 2579 scope.go:117] "RemoveContainer" containerID="46e112ae8eedeb5bcdfddec800a624843f1dc2a0c24084b3a6af1884704d3e48" Apr 17 11:16:31.996817 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:31.996785 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9wmp5_89b9086c-55f6-4d0b-a998-d22a793d7d17/console-operator/1.log" Apr 17 11:16:31.997195 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:31.996882 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" event={"ID":"89b9086c-55f6-4d0b-a998-d22a793d7d17","Type":"ContainerStarted","Data":"36cb4dfe6c42fd894b0748bb0dbbe7fbfbbd057383030d63a4d4c8b9063aeb9e"} Apr 17 11:16:31.997195 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:31.997178 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" Apr 17 11:16:32.003091 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:32.003069 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" 
Apr 17 11:16:32.019174 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:32.019135 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-9wmp5" podStartSLOduration=43.847631557 podStartE2EDuration="51.019122654s" podCreationTimestamp="2026-04-17 11:15:41 +0000 UTC" firstStartedPulling="2026-04-17 11:16:12.201494172 +0000 UTC m=+35.068018301" lastFinishedPulling="2026-04-17 11:16:19.372985271 +0000 UTC m=+42.239509398" observedRunningTime="2026-04-17 11:16:32.017670093 +0000 UTC m=+54.884194242" watchObservedRunningTime="2026-04-17 11:16:32.019122654 +0000 UTC m=+54.885646803" Apr 17 11:16:34.856865 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:34.856833 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lxcn4" Apr 17 11:16:42.406999 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.406956 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-65j5q\" (UID: \"cfa96c8c-1c5c-4749-a74a-b6eab2274afd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:42.407493 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.407019 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:42.407493 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.407061 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7lqsd\" (UID: \"a3e70c72-ac65-4a09-b59a-570bf07a6dbb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" Apr 17 11:16:42.409484 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.409456 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls\") pod \"image-registry-5b9d896675-7lskv\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") " pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:42.409594 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.409463 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3e70c72-ac65-4a09-b59a-570bf07a6dbb-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7lqsd\" (UID: \"a3e70c72-ac65-4a09-b59a-570bf07a6dbb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" Apr 17 11:16:42.409636 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.409590 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfa96c8c-1c5c-4749-a74a-b6eab2274afd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-65j5q\" (UID: \"cfa96c8c-1c5c-4749-a74a-b6eab2274afd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:42.507662 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.507622 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " 
pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:42.507848 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.507676 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert\") pod \"ingress-canary-sswv7\" (UID: \"05a441fa-9d9b-40d1-adfd-ffe296dfb2d0\") " pod="openshift-ingress-canary/ingress-canary-sswv7" Apr 17 11:16:42.507848 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.507708 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:42.507848 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.507739 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:42.507848 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.507759 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xjjnj\" (UID: \"d90dfee2-676e-4224-8ec8-8d764b523802\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" Apr 17 11:16:42.508051 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.507926 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:42.508512 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.508483 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-service-ca-bundle\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:42.510221 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.510193 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dd7876b-a6b7-4cf0-b645-979aead5bdff-metrics-tls\") pod \"dns-default-tkcmp\" (UID: \"4dd7876b-a6b7-4cf0-b645-979aead5bdff\") " pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:42.510336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.510250 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a441fa-9d9b-40d1-adfd-ffe296dfb2d0-cert\") pod \"ingress-canary-sswv7\" (UID: \"05a441fa-9d9b-40d1-adfd-ffe296dfb2d0\") " pod="openshift-ingress-canary/ingress-canary-sswv7" Apr 17 11:16:42.510336 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.510256 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/615b2482-c02f-4752-9ea6-7e7cef5c1fe9-metrics-certs\") pod \"router-default-7657d8478-pp7qf\" (UID: \"615b2482-c02f-4752-9ea6-7e7cef5c1fe9\") " pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:42.510653 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.510632 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls\") pod \"image-registry-799db9ddcb-l92ps\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:42.510705 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.510663 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d90dfee2-676e-4224-8ec8-8d764b523802-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xjjnj\" (UID: \"d90dfee2-676e-4224-8ec8-8d764b523802\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" Apr 17 11:16:42.605177 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.605145 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bqhrd\"" Apr 17 11:16:42.614060 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.614028 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" Apr 17 11:16:42.626121 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.626093 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7fq9z\"" Apr 17 11:16:42.634613 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.634566 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:42.648396 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.648367 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-5vlf2\"" Apr 17 11:16:42.656915 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.656880 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" Apr 17 11:16:42.671844 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.671811 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:42.682756 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.682524 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-pmnxw\"" Apr 17 11:16:42.691540 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.690707 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" Apr 17 11:16:42.694083 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.693097 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-swgpj\"" Apr 17 11:16:42.700431 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.700400 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7657d8478-pp7qf" Apr 17 11:16:42.713352 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.713326 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lbntp\"" Apr 17 11:16:42.721163 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.719175 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f8bg7\"" Apr 17 11:16:42.721163 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.720489 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sswv7" Apr 17 11:16:42.728440 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.727922 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:42.775327 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.772114 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd"] Apr 17 11:16:42.839022 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.838839 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b9d896675-7lskv"] Apr 17 11:16:42.851740 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:42.851428 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97fe64f0_9f87_4b25_876e_a59829b69c04.slice/crio-5501beda713b23d94077ac80ee75c293d1122a812513488698c10c01e0ede782 WatchSource:0}: Error finding container 5501beda713b23d94077ac80ee75c293d1122a812513488698c10c01e0ede782: Status 404 returned error can't find the container with id 5501beda713b23d94077ac80ee75c293d1122a812513488698c10c01e0ede782 Apr 17 11:16:42.874368 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.874305 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q"] Apr 17 11:16:42.901271 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:42.901206 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfa96c8c_1c5c_4749_a74a_b6eab2274afd.slice/crio-efd60cc90ed924993bf781f0556cde858320422f28e2aa8407976e017da44b44 WatchSource:0}: Error finding container efd60cc90ed924993bf781f0556cde858320422f28e2aa8407976e017da44b44: Status 404 returned error can't find the container with id efd60cc90ed924993bf781f0556cde858320422f28e2aa8407976e017da44b44 Apr 17 11:16:42.955366 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.955321 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj"] Apr 17 11:16:42.962714 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.962197 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-799db9ddcb-l92ps"] Apr 17 11:16:42.974192 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:42.974168 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeb961a2_0877_4cd8_be68_062f895cef5d.slice/crio-964f2f4bb0ee9a592f102bf52c23c2d1fb1f9c8398df6a134d5bc1331abef4d4 WatchSource:0}: Error finding container 964f2f4bb0ee9a592f102bf52c23c2d1fb1f9c8398df6a134d5bc1331abef4d4: Status 404 returned error can't find the container with id 964f2f4bb0ee9a592f102bf52c23c2d1fb1f9c8398df6a134d5bc1331abef4d4 Apr 17 11:16:42.980558 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:42.980536 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7657d8478-pp7qf"] Apr 17 11:16:42.982727 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:42.982702 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod615b2482_c02f_4752_9ea6_7e7cef5c1fe9.slice/crio-7142abb6fa84df653efc31673dcaae87375f7e2e1de081db27fa3d1bb8733d46 WatchSource:0}: Error finding container 7142abb6fa84df653efc31673dcaae87375f7e2e1de081db27fa3d1bb8733d46: Status 404 returned error can't find the container with id 7142abb6fa84df653efc31673dcaae87375f7e2e1de081db27fa3d1bb8733d46 Apr 17 11:16:43.027933 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:43.027905 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" event={"ID":"beb961a2-0877-4cd8-be68-062f895cef5d","Type":"ContainerStarted","Data":"964f2f4bb0ee9a592f102bf52c23c2d1fb1f9c8398df6a134d5bc1331abef4d4"} Apr 17 11:16:43.029138 ip-10-0-133-190 kubenswrapper[2579]: 
I0417 11:16:43.029110 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" event={"ID":"cfa96c8c-1c5c-4749-a74a-b6eab2274afd","Type":"ContainerStarted","Data":"efd60cc90ed924993bf781f0556cde858320422f28e2aa8407976e017da44b44"} Apr 17 11:16:43.030689 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:43.030665 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b9d896675-7lskv" event={"ID":"97fe64f0-9f87-4b25-876e-a59829b69c04","Type":"ContainerStarted","Data":"b8dac2aee625391c5c735c7a4f28bda17a879840dd45a73f8b9bb470668228aa"} Apr 17 11:16:43.030805 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:43.030696 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b9d896675-7lskv" event={"ID":"97fe64f0-9f87-4b25-876e-a59829b69c04","Type":"ContainerStarted","Data":"5501beda713b23d94077ac80ee75c293d1122a812513488698c10c01e0ede782"} Apr 17 11:16:43.030892 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:43.030812 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5b9d896675-7lskv" Apr 17 11:16:43.031967 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:43.031939 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" event={"ID":"a3e70c72-ac65-4a09-b59a-570bf07a6dbb","Type":"ContainerStarted","Data":"cdc1435adf07e412b68123c637092f4ad5c4fecde305de69a6775b1abf271f7e"} Apr 17 11:16:43.032955 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:43.032934 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7657d8478-pp7qf" event={"ID":"615b2482-c02f-4752-9ea6-7e7cef5c1fe9","Type":"ContainerStarted","Data":"7142abb6fa84df653efc31673dcaae87375f7e2e1de081db27fa3d1bb8733d46"} Apr 17 11:16:43.033997 ip-10-0-133-190 
kubenswrapper[2579]: I0417 11:16:43.033976 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" event={"ID":"d90dfee2-676e-4224-8ec8-8d764b523802","Type":"ContainerStarted","Data":"825162c6ac23491ca5c8103aeab601a82fc247e3426f6d35827322cb670dfa2e"} Apr 17 11:16:43.051043 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:43.051004 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5b9d896675-7lskv" podStartSLOduration=62.050989942 podStartE2EDuration="1m2.050989942s" podCreationTimestamp="2026-04-17 11:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:43.049978421 +0000 UTC m=+65.916502570" watchObservedRunningTime="2026-04-17 11:16:43.050989942 +0000 UTC m=+65.917514086" Apr 17 11:16:43.202801 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:43.202747 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sswv7"] Apr 17 11:16:43.205899 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:43.205878 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tkcmp"] Apr 17 11:16:43.207520 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:43.207483 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05a441fa_9d9b_40d1_adfd_ffe296dfb2d0.slice/crio-b40ad3a7c8d881f125f1aea9b909eeffc775a593cf3eefe7efc21a1c715396e6 WatchSource:0}: Error finding container b40ad3a7c8d881f125f1aea9b909eeffc775a593cf3eefe7efc21a1c715396e6: Status 404 returned error can't find the container with id b40ad3a7c8d881f125f1aea9b909eeffc775a593cf3eefe7efc21a1c715396e6 Apr 17 11:16:43.212654 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:43.212289 2579 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd7876b_a6b7_4cf0_b645_979aead5bdff.slice/crio-b5a466ecc73e529b5923f5d640ba6f717dc7ce6ccbff5f8f69538ac89eba9291 WatchSource:0}: Error finding container b5a466ecc73e529b5923f5d640ba6f717dc7ce6ccbff5f8f69538ac89eba9291: Status 404 returned error can't find the container with id b5a466ecc73e529b5923f5d640ba6f717dc7ce6ccbff5f8f69538ac89eba9291 Apr 17 11:16:43.423324 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:43.423279 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs\") pod \"network-metrics-daemon-9g7pq\" (UID: \"0ba74b24-e523-481e-82b5-080dc7ecb2e2\") " pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:16:43.432076 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:43.432038 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ba74b24-e523-481e-82b5-080dc7ecb2e2-metrics-certs\") pod \"network-metrics-daemon-9g7pq\" (UID: \"0ba74b24-e523-481e-82b5-080dc7ecb2e2\") " pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:16:43.437847 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:43.437587 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-98wmj\"" Apr 17 11:16:43.445220 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:43.445180 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9g7pq" Apr 17 11:16:43.614555 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:43.614519 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9g7pq"] Apr 17 11:16:44.046617 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.046573 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7657d8478-pp7qf" event={"ID":"615b2482-c02f-4752-9ea6-7e7cef5c1fe9","Type":"ContainerStarted","Data":"d1d9fbb7f2178c1cb312a21c891de0cf430e0df52a09614e9825b63066e0c886"} Apr 17 11:16:44.054203 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.054164 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" event={"ID":"beb961a2-0877-4cd8-be68-062f895cef5d","Type":"ContainerStarted","Data":"6ae9733fe580e2c9f90a09ee0c477c0f23172370d85ce4a945c0f22de9e643ab"} Apr 17 11:16:44.054851 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.054821 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:16:44.058248 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.058194 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tkcmp" event={"ID":"4dd7876b-a6b7-4cf0-b645-979aead5bdff","Type":"ContainerStarted","Data":"b5a466ecc73e529b5923f5d640ba6f717dc7ce6ccbff5f8f69538ac89eba9291"} Apr 17 11:16:44.064158 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.064093 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9g7pq" event={"ID":"0ba74b24-e523-481e-82b5-080dc7ecb2e2","Type":"ContainerStarted","Data":"6b6c0367d0d3c943df68c9ae7124041ee41c2d355b18bc3cbab7476ac01f708c"} Apr 17 11:16:44.065645 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.065622 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-sswv7" event={"ID":"05a441fa-9d9b-40d1-adfd-ffe296dfb2d0","Type":"ContainerStarted","Data":"b40ad3a7c8d881f125f1aea9b909eeffc775a593cf3eefe7efc21a1c715396e6"} Apr 17 11:16:44.090622 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.089117 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7657d8478-pp7qf" podStartSLOduration=63.089099592 podStartE2EDuration="1m3.089099592s" podCreationTimestamp="2026-04-17 11:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:44.068784095 +0000 UTC m=+66.935308237" watchObservedRunningTime="2026-04-17 11:16:44.089099592 +0000 UTC m=+66.955623742" Apr 17 11:16:44.090622 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.089951 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" podStartSLOduration=66.089937274 podStartE2EDuration="1m6.089937274s" podCreationTimestamp="2026-04-17 11:15:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:44.088678479 +0000 UTC m=+66.955202680" watchObservedRunningTime="2026-04-17 11:16:44.089937274 +0000 UTC m=+66.956461426" Apr 17 11:16:44.660970 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.660563 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77b46b789c-mmfkx"] Apr 17 11:16:44.665917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.665890 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"] Apr 17 11:16:44.666165 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.666144 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77b46b789c-mmfkx" Apr 17 11:16:44.668436 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.668290 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 11:16:44.668436 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.668392 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-5fmpg\"" Apr 17 11:16:44.668654 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.668524 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 11:16:44.668828 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.668812 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 11:16:44.668889 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.668871 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 11:16:44.669839 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.669329 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx"] Apr 17 11:16:44.669839 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.669474 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d" Apr 17 11:16:44.672642 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.672623 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx" Apr 17 11:16:44.676143 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.675893 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 11:16:44.677014 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.676730 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 11:16:44.677014 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.676976 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 11:16:44.680159 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.680137 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"] Apr 17 11:16:44.680314 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.680297 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 11:16:44.680562 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.680544 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 11:16:44.682863 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.682826 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77b46b789c-mmfkx"] Apr 17 11:16:44.682985 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.682948 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx"]
Apr 17 11:16:44.701161 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.701110 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7657d8478-pp7qf"
Apr 17 11:16:44.704092 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.704071 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7657d8478-pp7qf"
Apr 17 11:16:44.783134 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.783105 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-654774794f-gl7h8"]
Apr 17 11:16:44.786241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.786219 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:44.788666 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.788641 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 11:16:44.788836 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.788818 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 11:16:44.788904 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.788860 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 11:16:44.788904 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.788876 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-w9dfd\""
Apr 17 11:16:44.788904 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.788892 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 11:16:44.789145 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.789102 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 11:16:44.789235 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.789154 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 11:16:44.789235 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.789227 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 11:16:44.791132 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.791114 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-cqq7h"]
Apr 17 11:16:44.794270 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.794241 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-cqq7h"
Apr 17 11:16:44.798030 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.797993 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-qpw62\""
Apr 17 11:16:44.803976 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.803833 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5b9d896675-7lskv"]
Apr 17 11:16:44.808382 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.808361 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-654774794f-gl7h8"]
Apr 17 11:16:44.812032 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.812008 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-cqq7h"]
Apr 17 11:16:44.835820 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.835791 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4c6a4a34-e15a-4316-9c07-e21a8e277aae-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77b46b789c-mmfkx\" (UID: \"4c6a4a34-e15a-4316-9c07-e21a8e277aae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77b46b789c-mmfkx"
Apr 17 11:16:44.835984 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.835838 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l95f\" (UniqueName: \"kubernetes.io/projected/186f7966-b1a1-4e40-942f-e9cb89cda1bb-kube-api-access-7l95f\") pod \"klusterlet-addon-workmgr-8bbc9998b-zxgrx\" (UID: \"186f7966-b1a1-4e40-942f-e9cb89cda1bb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx"
Apr 17 11:16:44.835984 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.835897 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/26cf1831-1529-4ba7-acc9-63f6e4e1b700-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.835984 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.835963 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/186f7966-b1a1-4e40-942f-e9cb89cda1bb-tmp\") pod \"klusterlet-addon-workmgr-8bbc9998b-zxgrx\" (UID: \"186f7966-b1a1-4e40-942f-e9cb89cda1bb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx"
Apr 17 11:16:44.836145 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.836059 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/26cf1831-1529-4ba7-acc9-63f6e4e1b700-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.836145 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.836092 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/186f7966-b1a1-4e40-942f-e9cb89cda1bb-klusterlet-config\") pod \"klusterlet-addon-workmgr-8bbc9998b-zxgrx\" (UID: \"186f7966-b1a1-4e40-942f-e9cb89cda1bb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx"
Apr 17 11:16:44.836145 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.836126 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/26cf1831-1529-4ba7-acc9-63f6e4e1b700-hub\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.836292 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.836175 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/26cf1831-1529-4ba7-acc9-63f6e4e1b700-ca\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.836292 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.836206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs54f\" (UniqueName: \"kubernetes.io/projected/26cf1831-1529-4ba7-acc9-63f6e4e1b700-kube-api-access-vs54f\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.836292 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.836222 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlr7p\" (UniqueName: \"kubernetes.io/projected/4c6a4a34-e15a-4316-9c07-e21a8e277aae-kube-api-access-tlr7p\") pod \"managed-serviceaccount-addon-agent-77b46b789c-mmfkx\" (UID: \"4c6a4a34-e15a-4316-9c07-e21a8e277aae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77b46b789c-mmfkx"
Apr 17 11:16:44.836292 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.836244 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/26cf1831-1529-4ba7-acc9-63f6e4e1b700-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.899949 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.899915 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-pp8hg"]
Apr 17 11:16:44.903492 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.903466 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:44.905890 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.905857 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qr57f\""
Apr 17 11:16:44.906025 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.905928 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 11:16:44.906188 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.906172 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 11:16:44.917826 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.917785 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pp8hg"]
Apr 17 11:16:44.937019 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.936981 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4c6a4a34-e15a-4316-9c07-e21a8e277aae-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77b46b789c-mmfkx\" (UID: \"4c6a4a34-e15a-4316-9c07-e21a8e277aae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77b46b789c-mmfkx"
Apr 17 11:16:44.937178 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937036 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76d3f378-0b89-4691-a527-31be2c7922be-console-serving-cert\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:44.937178 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937080 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7l95f\" (UniqueName: \"kubernetes.io/projected/186f7966-b1a1-4e40-942f-e9cb89cda1bb-kube-api-access-7l95f\") pod \"klusterlet-addon-workmgr-8bbc9998b-zxgrx\" (UID: \"186f7966-b1a1-4e40-942f-e9cb89cda1bb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx"
Apr 17 11:16:44.937178 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937133 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/26cf1831-1529-4ba7-acc9-63f6e4e1b700-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.937178 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88rkj\" (UniqueName: \"kubernetes.io/projected/9473d957-8483-46e1-86cf-c51754d8c054-kube-api-access-88rkj\") pod \"downloads-6bcc868b7-cqq7h\" (UID: \"9473d957-8483-46e1-86cf-c51754d8c054\") " pod="openshift-console/downloads-6bcc868b7-cqq7h"
Apr 17 11:16:44.937386 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937192 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk4fh\" (UniqueName: \"kubernetes.io/projected/76d3f378-0b89-4691-a527-31be2c7922be-kube-api-access-qk4fh\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:44.937386 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937222 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-oauth-serving-cert\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:44.937386 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937251 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/186f7966-b1a1-4e40-942f-e9cb89cda1bb-tmp\") pod \"klusterlet-addon-workmgr-8bbc9998b-zxgrx\" (UID: \"186f7966-b1a1-4e40-942f-e9cb89cda1bb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx"
Apr 17 11:16:44.937386 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937288 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/26cf1831-1529-4ba7-acc9-63f6e4e1b700-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.937386 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/186f7966-b1a1-4e40-942f-e9cb89cda1bb-klusterlet-config\") pod \"klusterlet-addon-workmgr-8bbc9998b-zxgrx\" (UID: \"186f7966-b1a1-4e40-942f-e9cb89cda1bb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx"
Apr 17 11:16:44.937386 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937341 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-service-ca\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:44.937386 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937374 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/26cf1831-1529-4ba7-acc9-63f6e4e1b700-hub\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.937664 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937398 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-console-config\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:44.937664 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937423 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76d3f378-0b89-4691-a527-31be2c7922be-console-oauth-config\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:44.937664 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937457 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/26cf1831-1529-4ba7-acc9-63f6e4e1b700-ca\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.937664 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937483 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vs54f\" (UniqueName: \"kubernetes.io/projected/26cf1831-1529-4ba7-acc9-63f6e4e1b700-kube-api-access-vs54f\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.937664 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937511 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlr7p\" (UniqueName: \"kubernetes.io/projected/4c6a4a34-e15a-4316-9c07-e21a8e277aae-kube-api-access-tlr7p\") pod \"managed-serviceaccount-addon-agent-77b46b789c-mmfkx\" (UID: \"4c6a4a34-e15a-4316-9c07-e21a8e277aae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77b46b789c-mmfkx"
Apr 17 11:16:44.937664 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.937553 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/26cf1831-1529-4ba7-acc9-63f6e4e1b700-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.939638 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.938388 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/186f7966-b1a1-4e40-942f-e9cb89cda1bb-tmp\") pod \"klusterlet-addon-workmgr-8bbc9998b-zxgrx\" (UID: \"186f7966-b1a1-4e40-942f-e9cb89cda1bb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx"
Apr 17 11:16:44.939638 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.939233 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/26cf1831-1529-4ba7-acc9-63f6e4e1b700-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.940501 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.940478 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/186f7966-b1a1-4e40-942f-e9cb89cda1bb-klusterlet-config\") pod \"klusterlet-addon-workmgr-8bbc9998b-zxgrx\" (UID: \"186f7966-b1a1-4e40-942f-e9cb89cda1bb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx"
Apr 17 11:16:44.940758 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.940740 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/26cf1831-1529-4ba7-acc9-63f6e4e1b700-ca\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.941084 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.941050 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/26cf1831-1529-4ba7-acc9-63f6e4e1b700-hub\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.941169 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.941146 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/26cf1831-1529-4ba7-acc9-63f6e4e1b700-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.941711 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.941692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4c6a4a34-e15a-4316-9c07-e21a8e277aae-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77b46b789c-mmfkx\" (UID: \"4c6a4a34-e15a-4316-9c07-e21a8e277aae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77b46b789c-mmfkx"
Apr 17 11:16:44.941940 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.941921 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/26cf1831-1529-4ba7-acc9-63f6e4e1b700-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.954031 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.954004 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs54f\" (UniqueName: \"kubernetes.io/projected/26cf1831-1529-4ba7-acc9-63f6e4e1b700-kube-api-access-vs54f\") pod \"cluster-proxy-proxy-agent-856d477c55-5b49d\" (UID: \"26cf1831-1529-4ba7-acc9-63f6e4e1b700\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:44.955013 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.954968 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlr7p\" (UniqueName: \"kubernetes.io/projected/4c6a4a34-e15a-4316-9c07-e21a8e277aae-kube-api-access-tlr7p\") pod \"managed-serviceaccount-addon-agent-77b46b789c-mmfkx\" (UID: \"4c6a4a34-e15a-4316-9c07-e21a8e277aae\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77b46b789c-mmfkx"
Apr 17 11:16:44.955151 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.955128 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l95f\" (UniqueName: \"kubernetes.io/projected/186f7966-b1a1-4e40-942f-e9cb89cda1bb-kube-api-access-7l95f\") pod \"klusterlet-addon-workmgr-8bbc9998b-zxgrx\" (UID: \"186f7966-b1a1-4e40-942f-e9cb89cda1bb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx"
Apr 17 11:16:44.995463 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:44.995426 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77b46b789c-mmfkx"
Apr 17 11:16:45.005370 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.005346 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"
Apr 17 11:16:45.022041 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.022018 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx"
Apr 17 11:16:45.037926 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.037898 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88rkj\" (UniqueName: \"kubernetes.io/projected/9473d957-8483-46e1-86cf-c51754d8c054-kube-api-access-88rkj\") pod \"downloads-6bcc868b7-cqq7h\" (UID: \"9473d957-8483-46e1-86cf-c51754d8c054\") " pod="openshift-console/downloads-6bcc868b7-cqq7h"
Apr 17 11:16:45.038046 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.037933 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qk4fh\" (UniqueName: \"kubernetes.io/projected/76d3f378-0b89-4691-a527-31be2c7922be-kube-api-access-qk4fh\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:45.038046 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.037953 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-oauth-serving-cert\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:45.038046 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.037979 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/acfe3db2-72d7-40a4-bd0f-d52828be6509-crio-socket\") pod \"insights-runtime-extractor-pp8hg\" (UID: \"acfe3db2-72d7-40a4-bd0f-d52828be6509\") " pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:45.038046 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.038007 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-service-ca\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:45.038260 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.038105 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-console-config\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:45.038260 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.038137 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76d3f378-0b89-4691-a527-31be2c7922be-console-oauth-config\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:45.038260 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.038177 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/acfe3db2-72d7-40a4-bd0f-d52828be6509-data-volume\") pod \"insights-runtime-extractor-pp8hg\" (UID: \"acfe3db2-72d7-40a4-bd0f-d52828be6509\") " pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:45.038260 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.038245 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76d3f378-0b89-4691-a527-31be2c7922be-console-serving-cert\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:45.038533 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.038281 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/acfe3db2-72d7-40a4-bd0f-d52828be6509-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pp8hg\" (UID: \"acfe3db2-72d7-40a4-bd0f-d52828be6509\") " pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:45.038533 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.038313 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bw5k\" (UniqueName: \"kubernetes.io/projected/acfe3db2-72d7-40a4-bd0f-d52828be6509-kube-api-access-5bw5k\") pod \"insights-runtime-extractor-pp8hg\" (UID: \"acfe3db2-72d7-40a4-bd0f-d52828be6509\") " pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:45.038533 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.038340 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/acfe3db2-72d7-40a4-bd0f-d52828be6509-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pp8hg\" (UID: \"acfe3db2-72d7-40a4-bd0f-d52828be6509\") " pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:45.038798 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.038754 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-oauth-serving-cert\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:45.038905 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.038863 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-service-ca\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:45.039213 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.039194 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-console-config\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:45.040602 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.040580 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76d3f378-0b89-4691-a527-31be2c7922be-console-oauth-config\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:45.040692 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.040656 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76d3f378-0b89-4691-a527-31be2c7922be-console-serving-cert\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:45.047173 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.047127 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk4fh\" (UniqueName: \"kubernetes.io/projected/76d3f378-0b89-4691-a527-31be2c7922be-kube-api-access-qk4fh\") pod \"console-654774794f-gl7h8\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:45.048231 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.048211 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88rkj\" (UniqueName: \"kubernetes.io/projected/9473d957-8483-46e1-86cf-c51754d8c054-kube-api-access-88rkj\") pod \"downloads-6bcc868b7-cqq7h\" (UID: \"9473d957-8483-46e1-86cf-c51754d8c054\") " pod="openshift-console/downloads-6bcc868b7-cqq7h"
Apr 17 11:16:45.069214 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.069188 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7657d8478-pp7qf"
Apr 17 11:16:45.070444 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.070422 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7657d8478-pp7qf"
Apr 17 11:16:45.098107 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.098067 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:16:45.107877 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.107824 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-cqq7h"
Apr 17 11:16:45.138982 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.138928 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/acfe3db2-72d7-40a4-bd0f-d52828be6509-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pp8hg\" (UID: \"acfe3db2-72d7-40a4-bd0f-d52828be6509\") " pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:45.139139 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.139084 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bw5k\" (UniqueName: \"kubernetes.io/projected/acfe3db2-72d7-40a4-bd0f-d52828be6509-kube-api-access-5bw5k\") pod \"insights-runtime-extractor-pp8hg\" (UID: \"acfe3db2-72d7-40a4-bd0f-d52828be6509\") " pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:45.139198 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.139134 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/acfe3db2-72d7-40a4-bd0f-d52828be6509-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pp8hg\" (UID: \"acfe3db2-72d7-40a4-bd0f-d52828be6509\") " pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:45.139268 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.139210 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/acfe3db2-72d7-40a4-bd0f-d52828be6509-crio-socket\") pod \"insights-runtime-extractor-pp8hg\" (UID: \"acfe3db2-72d7-40a4-bd0f-d52828be6509\") " pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:45.139320 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.139271 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/acfe3db2-72d7-40a4-bd0f-d52828be6509-data-volume\") pod \"insights-runtime-extractor-pp8hg\" (UID: \"acfe3db2-72d7-40a4-bd0f-d52828be6509\") " pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:45.139429 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.139407 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/acfe3db2-72d7-40a4-bd0f-d52828be6509-crio-socket\") pod \"insights-runtime-extractor-pp8hg\" (UID: \"acfe3db2-72d7-40a4-bd0f-d52828be6509\") " pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:45.139566 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.139515 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/acfe3db2-72d7-40a4-bd0f-d52828be6509-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pp8hg\" (UID: \"acfe3db2-72d7-40a4-bd0f-d52828be6509\") " pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:45.139566 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.139564 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/acfe3db2-72d7-40a4-bd0f-d52828be6509-data-volume\") pod \"insights-runtime-extractor-pp8hg\" (UID: \"acfe3db2-72d7-40a4-bd0f-d52828be6509\") " pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:45.141596 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.141572 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/acfe3db2-72d7-40a4-bd0f-d52828be6509-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pp8hg\" (UID: \"acfe3db2-72d7-40a4-bd0f-d52828be6509\") " pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:45.150383 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.150364 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bw5k\" (UniqueName: \"kubernetes.io/projected/acfe3db2-72d7-40a4-bd0f-d52828be6509-kube-api-access-5bw5k\") pod \"insights-runtime-extractor-pp8hg\" (UID: \"acfe3db2-72d7-40a4-bd0f-d52828be6509\") " pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:45.214619 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:45.214582 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pp8hg"
Apr 17 11:16:46.342637 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.342492 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-57b77fd4b5-89vv8"]
Apr 17 11:16:46.349485 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.349457 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57b77fd4b5-89vv8"
Apr 17 11:16:46.355354 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.355323 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57b77fd4b5-89vv8"]
Apr 17 11:16:46.361878 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.361846 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 11:16:46.450809 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.450759 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-oauth-serving-cert\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8"
Apr 17 11:16:46.450947 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.450831 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-config\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8"
Apr 17 11:16:46.450947 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.450886 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcw7w\" (UniqueName: \"kubernetes.io/projected/316ca251-c7ff-4d30-9524-f9a64784a1ac-kube-api-access-kcw7w\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8"
Apr 17 11:16:46.451014 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.450952 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-serving-cert\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8"
Apr 17 11:16:46.451014 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.450994 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-trusted-ca-bundle\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8"
Apr 17 11:16:46.451084 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.451047 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-service-ca\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8"
Apr 17 11:16:46.451084
ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.451066 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-oauth-config\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:46.552527 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.552488 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-oauth-serving-cert\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:46.552698 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.552544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-config\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:46.552698 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.552590 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcw7w\" (UniqueName: \"kubernetes.io/projected/316ca251-c7ff-4d30-9524-f9a64784a1ac-kube-api-access-kcw7w\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:46.552698 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.552615 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-serving-cert\") pod \"console-57b77fd4b5-89vv8\" (UID: 
\"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:46.552888 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.552814 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-trusted-ca-bundle\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:46.552888 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.552861 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-service-ca\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:46.552976 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.552891 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-oauth-config\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:46.553393 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.553359 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-oauth-serving-cert\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:46.553525 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.553475 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-config\") 
pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:46.553785 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.553747 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-service-ca\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:46.554083 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.554052 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-trusted-ca-bundle\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:46.555673 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.555648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-oauth-config\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:46.555673 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.555657 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-serving-cert\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:46.560600 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.560573 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcw7w\" (UniqueName: 
\"kubernetes.io/projected/316ca251-c7ff-4d30-9524-f9a64784a1ac-kube-api-access-kcw7w\") pod \"console-57b77fd4b5-89vv8\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:46.666905 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:46.666862 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:16:47.034225 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.034167 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx"] Apr 17 11:16:47.073843 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:47.067036 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod186f7966_b1a1_4e40_942f_e9cb89cda1bb.slice/crio-320de447a8c2dfad02772b54d46137ab761d3b6520987caf0aada4fc17fe9249 WatchSource:0}: Error finding container 320de447a8c2dfad02772b54d46137ab761d3b6520987caf0aada4fc17fe9249: Status 404 returned error can't find the container with id 320de447a8c2dfad02772b54d46137ab761d3b6520987caf0aada4fc17fe9249 Apr 17 11:16:47.073843 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.068322 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d"] Apr 17 11:16:47.087552 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:47.087512 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26cf1831_1529_4ba7_acc9_63f6e4e1b700.slice/crio-1cce12a4c628298db2782cad6ed4766e39189044e221ce5aaa6e3e63d4c62f7c WatchSource:0}: Error finding container 1cce12a4c628298db2782cad6ed4766e39189044e221ce5aaa6e3e63d4c62f7c: Status 404 returned error can't find the container with id 1cce12a4c628298db2782cad6ed4766e39189044e221ce5aaa6e3e63d4c62f7c Apr 17 
11:16:47.134286 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.134234 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77b46b789c-mmfkx"] Apr 17 11:16:47.139582 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.138123 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sswv7" event={"ID":"05a441fa-9d9b-40d1-adfd-ffe296dfb2d0","Type":"ContainerStarted","Data":"21c9057a1de262b050b1f25d6f92f6f19ff07a7713f62fb78c189c85106d673e"} Apr 17 11:16:47.142552 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.142493 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx" event={"ID":"186f7966-b1a1-4e40-942f-e9cb89cda1bb","Type":"ContainerStarted","Data":"320de447a8c2dfad02772b54d46137ab761d3b6520987caf0aada4fc17fe9249"} Apr 17 11:16:47.145727 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.145066 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" event={"ID":"d90dfee2-676e-4224-8ec8-8d764b523802","Type":"ContainerStarted","Data":"ec105a0e8e3fcdfdc2403f7d3a6a753f972fba4261dffb93f7b31ef7f6026695"} Apr 17 11:16:47.147444 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:47.147415 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c6a4a34_e15a_4316_9c07_e21a8e277aae.slice/crio-2a0aebc3209ddf84c6ca02b438544ed0aae98b7a4de62aece992ebd403a6608f WatchSource:0}: Error finding container 2a0aebc3209ddf84c6ca02b438544ed0aae98b7a4de62aece992ebd403a6608f: Status 404 returned error can't find the container with id 2a0aebc3209ddf84c6ca02b438544ed0aae98b7a4de62aece992ebd403a6608f Apr 17 11:16:47.148860 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.148805 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" event={"ID":"cfa96c8c-1c5c-4749-a74a-b6eab2274afd","Type":"ContainerStarted","Data":"9c665e39fd3ab93817669172e2534c410e906b651b549917192bcc4b3be8eae8"} Apr 17 11:16:47.155225 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.154547 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57b77fd4b5-89vv8"] Apr 17 11:16:47.158639 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.158560 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sswv7" podStartSLOduration=33.617492485 podStartE2EDuration="37.158530918s" podCreationTimestamp="2026-04-17 11:16:10 +0000 UTC" firstStartedPulling="2026-04-17 11:16:43.210103192 +0000 UTC m=+66.076627320" lastFinishedPulling="2026-04-17 11:16:46.751141611 +0000 UTC m=+69.617665753" observedRunningTime="2026-04-17 11:16:47.155677357 +0000 UTC m=+70.022201516" watchObservedRunningTime="2026-04-17 11:16:47.158530918 +0000 UTC m=+70.025055070" Apr 17 11:16:47.163194 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.162953 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pp8hg"] Apr 17 11:16:47.184232 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:47.179595 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacfe3db2_72d7_40a4_bd0f_d52828be6509.slice/crio-a5a784de67dcb761b0cfc99a660fbb245fe758a6c46e4aa36b722013eb2d9eae WatchSource:0}: Error finding container a5a784de67dcb761b0cfc99a660fbb245fe758a6c46e4aa36b722013eb2d9eae: Status 404 returned error can't find the container with id a5a784de67dcb761b0cfc99a660fbb245fe758a6c46e4aa36b722013eb2d9eae Apr 17 11:16:47.217710 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.217209 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-xjjnj" podStartSLOduration=62.46050035 podStartE2EDuration="1m6.217191054s" podCreationTimestamp="2026-04-17 11:15:41 +0000 UTC" firstStartedPulling="2026-04-17 11:16:42.962999498 +0000 UTC m=+65.829523629" lastFinishedPulling="2026-04-17 11:16:46.71969019 +0000 UTC m=+69.586214333" observedRunningTime="2026-04-17 11:16:47.214542473 +0000 UTC m=+70.081066633" watchObservedRunningTime="2026-04-17 11:16:47.217191054 +0000 UTC m=+70.083715205" Apr 17 11:16:47.219550 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.219481 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-65j5q" podStartSLOduration=62.411769107 podStartE2EDuration="1m6.219456747s" podCreationTimestamp="2026-04-17 11:15:41 +0000 UTC" firstStartedPulling="2026-04-17 11:16:42.917588727 +0000 UTC m=+65.784112863" lastFinishedPulling="2026-04-17 11:16:46.725276371 +0000 UTC m=+69.591800503" observedRunningTime="2026-04-17 11:16:47.189671026 +0000 UTC m=+70.056195176" watchObservedRunningTime="2026-04-17 11:16:47.219456747 +0000 UTC m=+70.085980908" Apr 17 11:16:47.359786 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.359734 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-654774794f-gl7h8"] Apr 17 11:16:47.363347 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.362741 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-cqq7h"] Apr 17 11:16:47.563111 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.563076 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w9zc9"] Apr 17 11:16:47.566380 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.566356 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w9zc9" Apr 17 11:16:47.568281 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.568257 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 17 11:16:47.568422 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.568367 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-bd86j\"" Apr 17 11:16:47.576943 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.576915 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w9zc9"] Apr 17 11:16:47.665915 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.665868 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ddb1e496-a5cc-4d00-9aef-856c4b501cf7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-w9zc9\" (UID: \"ddb1e496-a5cc-4d00-9aef-856c4b501cf7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w9zc9" Apr 17 11:16:47.766615 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:47.766534 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ddb1e496-a5cc-4d00-9aef-856c4b501cf7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-w9zc9\" (UID: \"ddb1e496-a5cc-4d00-9aef-856c4b501cf7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w9zc9" Apr 17 11:16:47.766805 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:47.766694 2579 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret 
"prometheus-operator-admission-webhook-tls" not found Apr 17 11:16:47.766805 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:16:47.766760 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddb1e496-a5cc-4d00-9aef-856c4b501cf7-tls-certificates podName:ddb1e496-a5cc-4d00-9aef-856c4b501cf7 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:48.266738431 +0000 UTC m=+71.133262559 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/ddb1e496-a5cc-4d00-9aef-856c4b501cf7-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-w9zc9" (UID: "ddb1e496-a5cc-4d00-9aef-856c4b501cf7") : secret "prometheus-operator-admission-webhook-tls" not found Apr 17 11:16:48.157216 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.157167 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d" event={"ID":"26cf1831-1529-4ba7-acc9-63f6e4e1b700","Type":"ContainerStarted","Data":"1cce12a4c628298db2782cad6ed4766e39189044e221ce5aaa6e3e63d4c62f7c"} Apr 17 11:16:48.160037 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.159999 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" event={"ID":"a3e70c72-ac65-4a09-b59a-570bf07a6dbb","Type":"ContainerStarted","Data":"2e9f84a36f14567e632ea05c0f15302389d2864f76dcba596cb1e7db7e5b0435"} Apr 17 11:16:48.160037 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.160038 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" event={"ID":"a3e70c72-ac65-4a09-b59a-570bf07a6dbb","Type":"ContainerStarted","Data":"2aca880ff9b77e37f44715b3706a9dafef5a4114c9b70c4dfb171bedbc01d4cc"} Apr 17 11:16:48.162036 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.161968 2579 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-cqq7h" event={"ID":"9473d957-8483-46e1-86cf-c51754d8c054","Type":"ContainerStarted","Data":"256c3d29b9dc3d62ea9fd7003ed12578386205f60dcafe9289c40f53be1103e6"} Apr 17 11:16:48.163916 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.163891 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57b77fd4b5-89vv8" event={"ID":"316ca251-c7ff-4d30-9524-f9a64784a1ac","Type":"ContainerStarted","Data":"743ab75fbe9395131b3404d06e9fbe7b559d7d9273dccc56a3357a69d9336b06"} Apr 17 11:16:48.166596 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.166562 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77b46b789c-mmfkx" event={"ID":"4c6a4a34-e15a-4316-9c07-e21a8e277aae","Type":"ContainerStarted","Data":"2a0aebc3209ddf84c6ca02b438544ed0aae98b7a4de62aece992ebd403a6608f"} Apr 17 11:16:48.180305 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.180189 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tkcmp" event={"ID":"4dd7876b-a6b7-4cf0-b645-979aead5bdff","Type":"ContainerStarted","Data":"94cac0cd9cdef3edc2aa8878bbef9ae794c647e95f51582b220e97a7ba62658b"} Apr 17 11:16:48.180305 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.180225 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tkcmp" event={"ID":"4dd7876b-a6b7-4cf0-b645-979aead5bdff","Type":"ContainerStarted","Data":"720ab0d7cddc7431427d55389c8d4c5fa65f774113365d6f697908ddd7349b35"} Apr 17 11:16:48.180503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.180366 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-tkcmp" Apr 17 11:16:48.181407 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.181243 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7lqsd" podStartSLOduration=63.343548849 podStartE2EDuration="1m7.181227801s" podCreationTimestamp="2026-04-17 11:15:41 +0000 UTC" firstStartedPulling="2026-04-17 11:16:42.929243741 +0000 UTC m=+65.795767869" lastFinishedPulling="2026-04-17 11:16:46.766922679 +0000 UTC m=+69.633446821" observedRunningTime="2026-04-17 11:16:48.18027049 +0000 UTC m=+71.046794641" watchObservedRunningTime="2026-04-17 11:16:48.181227801 +0000 UTC m=+71.047751953" Apr 17 11:16:48.191221 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.191168 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-654774794f-gl7h8" event={"ID":"76d3f378-0b89-4691-a527-31be2c7922be","Type":"ContainerStarted","Data":"8ac5bdf532d96eadc53aecbd5996d21438fc975fdbb5bce6c123d05da69f4863"} Apr 17 11:16:48.197718 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.197662 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pp8hg" event={"ID":"acfe3db2-72d7-40a4-bd0f-d52828be6509","Type":"ContainerStarted","Data":"760bd8bcb6307b7f4e5de19727ca0dde21db4aac72035e19765c025afb78ddc8"} Apr 17 11:16:48.197718 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.197696 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pp8hg" event={"ID":"acfe3db2-72d7-40a4-bd0f-d52828be6509","Type":"ContainerStarted","Data":"a5a784de67dcb761b0cfc99a660fbb245fe758a6c46e4aa36b722013eb2d9eae"} Apr 17 11:16:48.208982 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.207997 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9g7pq" event={"ID":"0ba74b24-e523-481e-82b5-080dc7ecb2e2","Type":"ContainerStarted","Data":"8d1e63c0b335a0d8d2342817b256f52673c150224eb4de14cb5f090c31516d1f"} Apr 17 11:16:48.208982 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.208031 2579 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9g7pq" event={"ID":"0ba74b24-e523-481e-82b5-080dc7ecb2e2","Type":"ContainerStarted","Data":"70fe91154aadfa8006679fdcbe14e8270c93b0adcf97b1edaf5560e2f600f705"} Apr 17 11:16:48.224788 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.223596 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tkcmp" podStartSLOduration=34.681098386 podStartE2EDuration="38.223576601s" podCreationTimestamp="2026-04-17 11:16:10 +0000 UTC" firstStartedPulling="2026-04-17 11:16:43.214851676 +0000 UTC m=+66.081375804" lastFinishedPulling="2026-04-17 11:16:46.757329878 +0000 UTC m=+69.623854019" observedRunningTime="2026-04-17 11:16:48.202124143 +0000 UTC m=+71.068648298" watchObservedRunningTime="2026-04-17 11:16:48.223576601 +0000 UTC m=+71.090100753" Apr 17 11:16:48.272664 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.272627 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ddb1e496-a5cc-4d00-9aef-856c4b501cf7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-w9zc9\" (UID: \"ddb1e496-a5cc-4d00-9aef-856c4b501cf7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w9zc9" Apr 17 11:16:48.279684 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.279653 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ddb1e496-a5cc-4d00-9aef-856c4b501cf7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-w9zc9\" (UID: \"ddb1e496-a5cc-4d00-9aef-856c4b501cf7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w9zc9" Apr 17 11:16:48.476227 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.476194 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w9zc9" Apr 17 11:16:48.727735 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.725636 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9g7pq" podStartSLOduration=68.534647213 podStartE2EDuration="1m11.725614078s" podCreationTimestamp="2026-04-17 11:15:37 +0000 UTC" firstStartedPulling="2026-04-17 11:16:43.621411028 +0000 UTC m=+66.487935163" lastFinishedPulling="2026-04-17 11:16:46.812377893 +0000 UTC m=+69.678902028" observedRunningTime="2026-04-17 11:16:48.225044176 +0000 UTC m=+71.091568324" watchObservedRunningTime="2026-04-17 11:16:48.725614078 +0000 UTC m=+71.592138230" Apr 17 11:16:48.727735 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:48.727639 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w9zc9"] Apr 17 11:16:48.738368 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:16:48.738329 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddb1e496_a5cc_4d00_9aef_856c4b501cf7.slice/crio-4ee87e36502c7b3e10b2b4a4279758dcf3a303a27aba2e803cc5b0ace477103d WatchSource:0}: Error finding container 4ee87e36502c7b3e10b2b4a4279758dcf3a303a27aba2e803cc5b0ace477103d: Status 404 returned error can't find the container with id 4ee87e36502c7b3e10b2b4a4279758dcf3a303a27aba2e803cc5b0ace477103d Apr 17 11:16:49.235599 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:49.235555 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w9zc9" event={"ID":"ddb1e496-a5cc-4d00-9aef-856c4b501cf7","Type":"ContainerStarted","Data":"4ee87e36502c7b3e10b2b4a4279758dcf3a303a27aba2e803cc5b0ace477103d"} Apr 17 11:16:49.262357 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:49.261115 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pp8hg" event={"ID":"acfe3db2-72d7-40a4-bd0f-d52828be6509","Type":"ContainerStarted","Data":"16891dd2afdd9f1fb999576e7bec84c0f6e944454586e4b36708436a9cb544b0"}
Apr 17 11:16:50.954067 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:50.953192 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-47nt5"
Apr 17 11:16:58.263961 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.263907 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tkcmp"
Apr 17 11:16:58.303722 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.303657 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77b46b789c-mmfkx" event={"ID":"4c6a4a34-e15a-4316-9c07-e21a8e277aae","Type":"ContainerStarted","Data":"2a3ba37d11e0317382d90a337ef22465dfd2b88824a5f840128ee74380ae11f2"}
Apr 17 11:16:58.305707 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.305659 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-654774794f-gl7h8" event={"ID":"76d3f378-0b89-4691-a527-31be2c7922be","Type":"ContainerStarted","Data":"0016edb42d47806d7e4f90ae6f2239bd97a345dab6674d42eefa1cbcd91f5ec4"}
Apr 17 11:16:58.309029 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.308994 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pp8hg" event={"ID":"acfe3db2-72d7-40a4-bd0f-d52828be6509","Type":"ContainerStarted","Data":"c193402aeb06b4b55b64e7ae2d937db1054b88e793ab917351b2799317520db2"}
Apr 17 11:16:58.310606 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.310573 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d" event={"ID":"26cf1831-1529-4ba7-acc9-63f6e4e1b700","Type":"ContainerStarted","Data":"e7082eac53108d4f4c7640a7ad6e892d792350c1b82501fee12802467236a3ef"}
Apr 17 11:16:58.312170 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.312140 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w9zc9" event={"ID":"ddb1e496-a5cc-4d00-9aef-856c4b501cf7","Type":"ContainerStarted","Data":"98d7b54c14219878cd966de247615401839d94fa9a9910e353bff3644eb8fef2"}
Apr 17 11:16:58.312579 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.312561 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w9zc9"
Apr 17 11:16:58.314900 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.314877 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx" event={"ID":"186f7966-b1a1-4e40-942f-e9cb89cda1bb","Type":"ContainerStarted","Data":"308ea42b8aaffd4f155216a10211d0ec78e9afd378724332cbf5782235e3a35c"}
Apr 17 11:16:58.315156 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.315127 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx"
Apr 17 11:16:58.316809 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.316763 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx"
Apr 17 11:16:58.317296 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.317256 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57b77fd4b5-89vv8" event={"ID":"316ca251-c7ff-4d30-9524-f9a64784a1ac","Type":"ContainerStarted","Data":"b215e9e316734e45e831dbc99d736c5fb558d8824679cfe50390fc4e1fa3e31e"}
Apr 17 11:16:58.318118 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.318096 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w9zc9"
Apr 17 11:16:58.320954 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.320900 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77b46b789c-mmfkx" podStartSLOduration=3.732762879 podStartE2EDuration="14.320885244s" podCreationTimestamp="2026-04-17 11:16:44 +0000 UTC" firstStartedPulling="2026-04-17 11:16:47.150675965 +0000 UTC m=+70.017200099" lastFinishedPulling="2026-04-17 11:16:57.73879833 +0000 UTC m=+80.605322464" observedRunningTime="2026-04-17 11:16:58.318560741 +0000 UTC m=+81.185084891" watchObservedRunningTime="2026-04-17 11:16:58.320885244 +0000 UTC m=+81.187409386"
Apr 17 11:16:58.335595 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.335529 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-pp8hg" podStartSLOduration=3.930438026 podStartE2EDuration="14.335513375s" podCreationTimestamp="2026-04-17 11:16:44 +0000 UTC" firstStartedPulling="2026-04-17 11:16:47.323311711 +0000 UTC m=+70.189835853" lastFinishedPulling="2026-04-17 11:16:57.72838707 +0000 UTC m=+80.594911202" observedRunningTime="2026-04-17 11:16:58.334364085 +0000 UTC m=+81.200888236" watchObservedRunningTime="2026-04-17 11:16:58.335513375 +0000 UTC m=+81.202037528"
Apr 17 11:16:58.351190 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.351139 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57b77fd4b5-89vv8" podStartSLOduration=1.797101417 podStartE2EDuration="12.351126919s" podCreationTimestamp="2026-04-17 11:16:46 +0000 UTC" firstStartedPulling="2026-04-17 11:16:47.174852314 +0000 UTC m=+70.041376459" lastFinishedPulling="2026-04-17 11:16:57.728877826 +0000 UTC m=+80.595401961" observedRunningTime="2026-04-17 11:16:58.350820808 +0000 UTC m=+81.217344952" watchObservedRunningTime="2026-04-17 11:16:58.351126919 +0000 UTC m=+81.217651068"
Apr 17 11:16:58.367170 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.367098 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8bbc9998b-zxgrx" podStartSLOduration=3.691375544 podStartE2EDuration="14.367080304s" podCreationTimestamp="2026-04-17 11:16:44 +0000 UTC" firstStartedPulling="2026-04-17 11:16:47.085386751 +0000 UTC m=+69.951910891" lastFinishedPulling="2026-04-17 11:16:57.761091509 +0000 UTC m=+80.627615651" observedRunningTime="2026-04-17 11:16:58.365703421 +0000 UTC m=+81.232227570" watchObservedRunningTime="2026-04-17 11:16:58.367080304 +0000 UTC m=+81.233604470"
Apr 17 11:16:58.385672 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.385048 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-654774794f-gl7h8" podStartSLOduration=4.029343971 podStartE2EDuration="14.385031479s" podCreationTimestamp="2026-04-17 11:16:44 +0000 UTC" firstStartedPulling="2026-04-17 11:16:47.373346257 +0000 UTC m=+70.239870387" lastFinishedPulling="2026-04-17 11:16:57.729033752 +0000 UTC m=+80.595557895" observedRunningTime="2026-04-17 11:16:58.383990693 +0000 UTC m=+81.250514843" watchObservedRunningTime="2026-04-17 11:16:58.385031479 +0000 UTC m=+81.251555629"
Apr 17 11:16:58.409086 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:16:58.409037 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w9zc9" podStartSLOduration=2.428617782 podStartE2EDuration="11.409022479s" podCreationTimestamp="2026-04-17 11:16:47 +0000 UTC" firstStartedPulling="2026-04-17 11:16:48.746597965 +0000 UTC m=+71.613122104" lastFinishedPulling="2026-04-17 11:16:57.727002666 +0000 UTC m=+80.593526801" observedRunningTime="2026-04-17 11:16:58.407548891 +0000 UTC m=+81.274073042" watchObservedRunningTime="2026-04-17 11:16:58.409022479 +0000 UTC m=+81.275546629"
Apr 17 11:17:02.677582 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:02.677425 2579 patch_prober.go:28] interesting pod/image-registry-799db9ddcb-l92ps container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 11:17:02.677582 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:02.677498 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" podUID="beb961a2-0877-4cd8-be68-062f895cef5d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 11:17:03.067672 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.067588 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dm5sp"]
Apr 17 11:17:03.074068 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.073787 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.077074 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.077047 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 11:17:03.077301 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.077285 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 11:17:03.077491 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.077474 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 11:17:03.077781 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.077636 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sfwkf\""
Apr 17 11:17:03.077781 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.077652 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 11:17:03.121048 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.120993 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-node-exporter-textfile\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.121201 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.121132 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-metrics-client-ca\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.121260 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.121203 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ttnw\" (UniqueName: \"kubernetes.io/projected/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-kube-api-access-4ttnw\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.121331 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.121263 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-node-exporter-tls\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.121331 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.121298 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.121549 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.121463 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-sys\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.121549 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.121494 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-root\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.121549 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.121518 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-node-exporter-wtmp\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.121697 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.121548 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-node-exporter-accelerators-collector-config\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.222394 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.222352 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-node-exporter-textfile\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.222578 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.222422 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-metrics-client-ca\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.222578 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.222454 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ttnw\" (UniqueName: \"kubernetes.io/projected/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-kube-api-access-4ttnw\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.222578 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.222491 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-node-exporter-tls\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.222578 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.222517 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.222578 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.222576 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-sys\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.222792 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.222603 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-root\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.222792 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.222627 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-node-exporter-wtmp\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.222792 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.222665 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-node-exporter-accelerators-collector-config\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.222792 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.222719 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-node-exporter-textfile\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.222918 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.222815 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-sys\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.222964 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.222943 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-root\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.223269 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.223251 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-node-exporter-accelerators-collector-config\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.223340 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.223279 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-node-exporter-wtmp\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.223702 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.223677 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-metrics-client-ca\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.225664 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.225640 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-node-exporter-tls\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.225761 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.225739 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.229853 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.229807 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ttnw\" (UniqueName: \"kubernetes.io/projected/0db0e0ab-e7ee-4975-8084-8c78f76a35ce-kube-api-access-4ttnw\") pod \"node-exporter-dm5sp\" (UID: \"0db0e0ab-e7ee-4975-8084-8c78f76a35ce\") " pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:03.386475 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:03.386391 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dm5sp"
Apr 17 11:17:05.076069 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:05.076032 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5b9d896675-7lskv"
Apr 17 11:17:05.098428 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:05.098389 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:17:05.098855 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:05.098811 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:17:05.104439 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:05.104414 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:17:05.351029 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:05.350953 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-654774794f-gl7h8"
Apr 17 11:17:06.079629 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:06.079599 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-799db9ddcb-l92ps"
Apr 17 11:17:06.667822 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:06.667757 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57b77fd4b5-89vv8"
Apr 17 11:17:06.667993 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:06.667839 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-57b77fd4b5-89vv8"
Apr 17 11:17:06.672904 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:06.672880 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57b77fd4b5-89vv8"
Apr 17 11:17:07.357716 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:07.357689 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57b77fd4b5-89vv8"
Apr 17 11:17:07.406699 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:07.406660 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-654774794f-gl7h8"]
Apr 17 11:17:08.361083 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:08.361016 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dm5sp" event={"ID":"0db0e0ab-e7ee-4975-8084-8c78f76a35ce","Type":"ContainerStarted","Data":"52161f319574a573c9dbcdd62eb7c5f005b750f660da902fc2243ebdc964a11f"}
Apr 17 11:17:08.363518 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:08.363479 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d" event={"ID":"26cf1831-1529-4ba7-acc9-63f6e4e1b700","Type":"ContainerStarted","Data":"e1a90e07cd976e99b8dc754c7ee0406c75eb0c0bf9691742d78697450dd331a0"}
Apr 17 11:17:08.363665 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:08.363528 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d" event={"ID":"26cf1831-1529-4ba7-acc9-63f6e4e1b700","Type":"ContainerStarted","Data":"9d3a3003a40662cee16ee710e9b6620fab37f841f62e5ede7b3df43768ebb5fc"}
Apr 17 11:17:08.366109 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:08.366084 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-cqq7h" event={"ID":"9473d957-8483-46e1-86cf-c51754d8c054","Type":"ContainerStarted","Data":"5b9db7298d5ba917fd69d3e3d059f73047f488d393c88eae0c4ebe08a89eeab7"}
Apr 17 11:17:08.366345 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:08.366277 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-cqq7h"
Apr 17 11:17:08.378475 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:08.378448 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-cqq7h"
Apr 17 11:17:08.381224 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:08.381171 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-856d477c55-5b49d" podStartSLOduration=3.838764819 podStartE2EDuration="24.381154118s" podCreationTimestamp="2026-04-17 11:16:44 +0000 UTC" firstStartedPulling="2026-04-17 11:16:47.107945707 +0000 UTC m=+69.974469837" lastFinishedPulling="2026-04-17 11:17:07.650334992 +0000 UTC m=+90.516859136" observedRunningTime="2026-04-17 11:17:08.380416922 +0000 UTC m=+91.246941076" watchObservedRunningTime="2026-04-17 11:17:08.381154118 +0000 UTC m=+91.247678270"
Apr 17 11:17:08.401759 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:08.401701 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-cqq7h" podStartSLOduration=4.121681659 podStartE2EDuration="24.401683436s" podCreationTimestamp="2026-04-17 11:16:44 +0000 UTC" firstStartedPulling="2026-04-17 11:16:47.377798942 +0000 UTC m=+70.244323076" lastFinishedPulling="2026-04-17 11:17:07.657800725 +0000 UTC m=+90.524324853" observedRunningTime="2026-04-17 11:17:08.40029744 +0000 UTC m=+91.266821593" watchObservedRunningTime="2026-04-17 11:17:08.401683436 +0000 UTC m=+91.268207587"
Apr 17 11:17:09.370719 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:09.370680 2579 generic.go:358] "Generic (PLEG): container finished" podID="0db0e0ab-e7ee-4975-8084-8c78f76a35ce" containerID="24941c8696dde73a6537e037b7b1f6c1c9536aa0c97d1b3136e162c99e9ec46a" exitCode=0
Apr 17 11:17:09.371180 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:09.370803 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dm5sp" event={"ID":"0db0e0ab-e7ee-4975-8084-8c78f76a35ce","Type":"ContainerDied","Data":"24941c8696dde73a6537e037b7b1f6c1c9536aa0c97d1b3136e162c99e9ec46a"}
Apr 17 11:17:10.088153 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.087939 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5b9d896675-7lskv" podUID="97fe64f0-9f87-4b25-876e-a59829b69c04" containerName="registry" containerID="cri-o://b8dac2aee625391c5c735c7a4f28bda17a879840dd45a73f8b9bb470668228aa" gracePeriod=30
Apr 17 11:17:10.375571 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.375542 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b9d896675-7lskv"
Apr 17 11:17:10.377446 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.377411 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dm5sp" event={"ID":"0db0e0ab-e7ee-4975-8084-8c78f76a35ce","Type":"ContainerStarted","Data":"4151cc2f26feb9bc158183dec62d83ae4f6b4e80603d4190cc1e763ec0a08d80"}
Apr 17 11:17:10.377587 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.377457 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dm5sp" event={"ID":"0db0e0ab-e7ee-4975-8084-8c78f76a35ce","Type":"ContainerStarted","Data":"8f75ad31abd4ca29977aaed46adfd8c4fd7ca35c0a2dc85b08302436fe242241"}
Apr 17 11:17:10.379376 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.378761 2579 generic.go:358] "Generic (PLEG): container finished" podID="97fe64f0-9f87-4b25-876e-a59829b69c04" containerID="b8dac2aee625391c5c735c7a4f28bda17a879840dd45a73f8b9bb470668228aa" exitCode=0
Apr 17 11:17:10.379376 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.378840 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b9d896675-7lskv"
Apr 17 11:17:10.379376 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.378849 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b9d896675-7lskv" event={"ID":"97fe64f0-9f87-4b25-876e-a59829b69c04","Type":"ContainerDied","Data":"b8dac2aee625391c5c735c7a4f28bda17a879840dd45a73f8b9bb470668228aa"}
Apr 17 11:17:10.379376 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.378880 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b9d896675-7lskv" event={"ID":"97fe64f0-9f87-4b25-876e-a59829b69c04","Type":"ContainerDied","Data":"5501beda713b23d94077ac80ee75c293d1122a812513488698c10c01e0ede782"}
Apr 17 11:17:10.379376 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.378896 2579 scope.go:117] "RemoveContainer" containerID="b8dac2aee625391c5c735c7a4f28bda17a879840dd45a73f8b9bb470668228aa"
Apr 17 11:17:10.389267 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.389240 2579 scope.go:117] "RemoveContainer" containerID="b8dac2aee625391c5c735c7a4f28bda17a879840dd45a73f8b9bb470668228aa"
Apr 17 11:17:10.389620 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:17:10.389595 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8dac2aee625391c5c735c7a4f28bda17a879840dd45a73f8b9bb470668228aa\": container with ID starting with b8dac2aee625391c5c735c7a4f28bda17a879840dd45a73f8b9bb470668228aa not found: ID does not exist" containerID="b8dac2aee625391c5c735c7a4f28bda17a879840dd45a73f8b9bb470668228aa"
Apr 17 11:17:10.389720 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.389629 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8dac2aee625391c5c735c7a4f28bda17a879840dd45a73f8b9bb470668228aa"} err="failed to get container status \"b8dac2aee625391c5c735c7a4f28bda17a879840dd45a73f8b9bb470668228aa\": rpc error: code = NotFound desc = could not find container \"b8dac2aee625391c5c735c7a4f28bda17a879840dd45a73f8b9bb470668228aa\": container with ID starting with b8dac2aee625391c5c735c7a4f28bda17a879840dd45a73f8b9bb470668228aa not found: ID does not exist"
Apr 17 11:17:10.416951 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.416886 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dm5sp" podStartSLOduration=6.544146382 podStartE2EDuration="7.416866664s" podCreationTimestamp="2026-04-17 11:17:03 +0000 UTC" firstStartedPulling="2026-04-17 11:17:07.649973451 +0000 UTC m=+90.516497785" lastFinishedPulling="2026-04-17 11:17:08.522693931 +0000 UTC m=+91.389218067" observedRunningTime="2026-04-17 11:17:10.415629469 +0000 UTC m=+93.282153618" watchObservedRunningTime="2026-04-17 11:17:10.416866664 +0000 UTC m=+93.283390815"
Apr 17 11:17:10.495620 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.495141 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97fe64f0-9f87-4b25-876e-a59829b69c04-ca-trust-extracted\") pod \"97fe64f0-9f87-4b25-876e-a59829b69c04\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") "
Apr 17 11:17:10.495620 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.495189 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97fe64f0-9f87-4b25-876e-a59829b69c04-trusted-ca\") pod \"97fe64f0-9f87-4b25-876e-a59829b69c04\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") "
Apr 17 11:17:10.495620 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.495258 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls\") pod \"97fe64f0-9f87-4b25-876e-a59829b69c04\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") "
Apr 17 11:17:10.495620 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.495289 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/97fe64f0-9f87-4b25-876e-a59829b69c04-image-registry-private-configuration\") pod \"97fe64f0-9f87-4b25-876e-a59829b69c04\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") "
Apr 17 11:17:10.495620 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.495357 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2dxm\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-kube-api-access-r2dxm\") pod \"97fe64f0-9f87-4b25-876e-a59829b69c04\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") "
Apr 17 11:17:10.495620 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.495388 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-certificates\") pod \"97fe64f0-9f87-4b25-876e-a59829b69c04\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") "
Apr 17 11:17:10.495620 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.495418 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97fe64f0-9f87-4b25-876e-a59829b69c04-installation-pull-secrets\") pod \"97fe64f0-9f87-4b25-876e-a59829b69c04\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") "
Apr 17 11:17:10.495620 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.495460 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-bound-sa-token\") pod \"97fe64f0-9f87-4b25-876e-a59829b69c04\" (UID: \"97fe64f0-9f87-4b25-876e-a59829b69c04\") "
Apr 17 11:17:10.496346 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.496314 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97fe64f0-9f87-4b25-876e-a59829b69c04-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "97fe64f0-9f87-4b25-876e-a59829b69c04" (UID: "97fe64f0-9f87-4b25-876e-a59829b69c04"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:17:10.496946 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.496913 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "97fe64f0-9f87-4b25-876e-a59829b69c04" (UID: "97fe64f0-9f87-4b25-876e-a59829b69c04"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:17:10.498702 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.498678 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97fe64f0-9f87-4b25-876e-a59829b69c04-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "97fe64f0-9f87-4b25-876e-a59829b69c04" (UID: "97fe64f0-9f87-4b25-876e-a59829b69c04"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:17:10.498877 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.498731 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-kube-api-access-r2dxm" (OuterVolumeSpecName: "kube-api-access-r2dxm") pod "97fe64f0-9f87-4b25-876e-a59829b69c04" (UID: "97fe64f0-9f87-4b25-876e-a59829b69c04"). InnerVolumeSpecName "kube-api-access-r2dxm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:17:10.499016 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.498760 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "97fe64f0-9f87-4b25-876e-a59829b69c04" (UID: "97fe64f0-9f87-4b25-876e-a59829b69c04"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:17:10.499127 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.499105 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97fe64f0-9f87-4b25-876e-a59829b69c04-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "97fe64f0-9f87-4b25-876e-a59829b69c04" (UID: "97fe64f0-9f87-4b25-876e-a59829b69c04"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:17:10.499387 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.499363 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "97fe64f0-9f87-4b25-876e-a59829b69c04" (UID: "97fe64f0-9f87-4b25-876e-a59829b69c04"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:17:10.506185 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.506156 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97fe64f0-9f87-4b25-876e-a59829b69c04-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "97fe64f0-9f87-4b25-876e-a59829b69c04" (UID: "97fe64f0-9f87-4b25-876e-a59829b69c04"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 11:17:10.596740 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.596704 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-tls\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Apr 17 11:17:10.596740 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.596739 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/97fe64f0-9f87-4b25-876e-a59829b69c04-image-registry-private-configuration\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Apr 17 11:17:10.596968 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.596753 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r2dxm\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-kube-api-access-r2dxm\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Apr 17 11:17:10.596968 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.596786 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97fe64f0-9f87-4b25-876e-a59829b69c04-registry-certificates\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Apr 17 11:17:10.596968 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.596803 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97fe64f0-9f87-4b25-876e-a59829b69c04-installation-pull-secrets\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Apr 17 11:17:10.596968 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.596814 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97fe64f0-9f87-4b25-876e-a59829b69c04-bound-sa-token\") on node \"ip-10-0-133-190.ec2.internal\"
DevicePath \"\"" Apr 17 11:17:10.596968 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.596829 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97fe64f0-9f87-4b25-876e-a59829b69c04-ca-trust-extracted\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:10.596968 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.596842 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97fe64f0-9f87-4b25-876e-a59829b69c04-trusted-ca\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:10.703006 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.702977 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5b9d896675-7lskv"] Apr 17 11:17:10.706712 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:10.706674 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5b9d896675-7lskv"] Apr 17 11:17:11.705413 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:11.705371 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97fe64f0-9f87-4b25-876e-a59829b69c04" path="/var/lib/kubelet/pods/97fe64f0-9f87-4b25-876e-a59829b69c04/volumes" Apr 17 11:17:12.094406 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:12.094320 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-799db9ddcb-l92ps"] Apr 17 11:17:14.941574 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:14.941535 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67cbf7fb8c-9zs8r"] Apr 17 11:17:14.942089 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:14.941958 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97fe64f0-9f87-4b25-876e-a59829b69c04" containerName="registry" Apr 17 11:17:14.942089 ip-10-0-133-190 kubenswrapper[2579]: I0417 
11:17:14.941978 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="97fe64f0-9f87-4b25-876e-a59829b69c04" containerName="registry" Apr 17 11:17:14.942089 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:14.942045 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="97fe64f0-9f87-4b25-876e-a59829b69c04" containerName="registry" Apr 17 11:17:15.011621 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.011593 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67cbf7fb8c-9zs8r"] Apr 17 11:17:15.011814 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.011721 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.137921 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.137887 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgddd\" (UniqueName: \"kubernetes.io/projected/6ecada3a-57c6-492e-b4b2-8e738576c1c4-kube-api-access-jgddd\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.138121 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.137940 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-service-ca\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.138121 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.137995 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-oauth-config\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " 
pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.138121 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.138057 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-oauth-serving-cert\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.138121 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.138111 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-trusted-ca-bundle\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.138337 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.138209 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-serving-cert\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.138337 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.138247 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-config\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.239116 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.239028 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-serving-cert\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.239116 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.239081 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-config\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.239299 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.239131 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgddd\" (UniqueName: \"kubernetes.io/projected/6ecada3a-57c6-492e-b4b2-8e738576c1c4-kube-api-access-jgddd\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.239299 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.239163 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-service-ca\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.239299 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.239189 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-oauth-config\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.239299 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.239216 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-oauth-serving-cert\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.239299 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.239263 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-trusted-ca-bundle\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.240023 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.239991 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-service-ca\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.240023 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.240018 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-trusted-ca-bundle\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.240490 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.240469 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-oauth-serving-cert\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.240593 ip-10-0-133-190 kubenswrapper[2579]: I0417 
11:17:15.240533 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-config\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.241945 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.241927 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-serving-cert\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.242672 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.242648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-oauth-config\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.248442 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.248416 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgddd\" (UniqueName: \"kubernetes.io/projected/6ecada3a-57c6-492e-b4b2-8e738576c1c4-kube-api-access-jgddd\") pod \"console-67cbf7fb8c-9zs8r\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.322389 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.322351 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:15.473512 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:15.473480 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67cbf7fb8c-9zs8r"] Apr 17 11:17:15.476285 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:17:15.476253 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ecada3a_57c6_492e_b4b2_8e738576c1c4.slice/crio-3f25b90a2068f28cc9401bda915de4c183d830d17f6791e5065f7ea9b48915ed WatchSource:0}: Error finding container 3f25b90a2068f28cc9401bda915de4c183d830d17f6791e5065f7ea9b48915ed: Status 404 returned error can't find the container with id 3f25b90a2068f28cc9401bda915de4c183d830d17f6791e5065f7ea9b48915ed Apr 17 11:17:16.406573 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:16.406531 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67cbf7fb8c-9zs8r" event={"ID":"6ecada3a-57c6-492e-b4b2-8e738576c1c4","Type":"ContainerStarted","Data":"f6788711607b3c67fb35551fe87b6634b67f4d5a4490025807fe611acc351241"} Apr 17 11:17:16.406573 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:16.406578 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67cbf7fb8c-9zs8r" event={"ID":"6ecada3a-57c6-492e-b4b2-8e738576c1c4","Type":"ContainerStarted","Data":"3f25b90a2068f28cc9401bda915de4c183d830d17f6791e5065f7ea9b48915ed"} Apr 17 11:17:16.424422 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:16.424367 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67cbf7fb8c-9zs8r" podStartSLOduration=2.424349387 podStartE2EDuration="2.424349387s" podCreationTimestamp="2026-04-17 11:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:17:16.423232628 +0000 UTC 
m=+99.289756778" watchObservedRunningTime="2026-04-17 11:17:16.424349387 +0000 UTC m=+99.290873538" Apr 17 11:17:25.322458 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:25.322422 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:25.322458 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:25.322471 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:25.327271 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:25.327252 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:25.440540 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:25.440514 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:17:25.492148 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:25.492113 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57b77fd4b5-89vv8"] Apr 17 11:17:26.441832 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:26.441795 2579 generic.go:358] "Generic (PLEG): container finished" podID="44d4ecc4-97ed-4995-a7f7-5f731f3fe770" containerID="e933106ac24b6a4f0acbd9e78c511e7f4a936b508be81cd6ae14e5012263f6d0" exitCode=0 Apr 17 11:17:26.442312 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:26.441868 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" event={"ID":"44d4ecc4-97ed-4995-a7f7-5f731f3fe770","Type":"ContainerDied","Data":"e933106ac24b6a4f0acbd9e78c511e7f4a936b508be81cd6ae14e5012263f6d0"} Apr 17 11:17:26.442312 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:26.442172 2579 scope.go:117] "RemoveContainer" containerID="e933106ac24b6a4f0acbd9e78c511e7f4a936b508be81cd6ae14e5012263f6d0" Apr 17 11:17:27.447711 ip-10-0-133-190 
kubenswrapper[2579]: I0417 11:17:27.447674 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ds9gs" event={"ID":"44d4ecc4-97ed-4995-a7f7-5f731f3fe770","Type":"ContainerStarted","Data":"b321319bea4afbb22c5e475566e0e152948e05a75a92b0aba2d1a311a54d19af"} Apr 17 11:17:33.389937 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.389869 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-654774794f-gl7h8" podUID="76d3f378-0b89-4691-a527-31be2c7922be" containerName="console" containerID="cri-o://0016edb42d47806d7e4f90ae6f2239bd97a345dab6674d42eefa1cbcd91f5ec4" gracePeriod=15 Apr 17 11:17:33.671978 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.671952 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-654774794f-gl7h8_76d3f378-0b89-4691-a527-31be2c7922be/console/0.log" Apr 17 11:17:33.672119 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.672014 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-654774794f-gl7h8" Apr 17 11:17:33.811465 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.811426 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-console-config\") pod \"76d3f378-0b89-4691-a527-31be2c7922be\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " Apr 17 11:17:33.811652 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.811538 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-service-ca\") pod \"76d3f378-0b89-4691-a527-31be2c7922be\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " Apr 17 11:17:33.811652 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.811578 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-oauth-serving-cert\") pod \"76d3f378-0b89-4691-a527-31be2c7922be\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " Apr 17 11:17:33.811652 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.811601 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76d3f378-0b89-4691-a527-31be2c7922be-console-oauth-config\") pod \"76d3f378-0b89-4691-a527-31be2c7922be\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " Apr 17 11:17:33.811652 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.811628 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk4fh\" (UniqueName: \"kubernetes.io/projected/76d3f378-0b89-4691-a527-31be2c7922be-kube-api-access-qk4fh\") pod \"76d3f378-0b89-4691-a527-31be2c7922be\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " Apr 17 11:17:33.811899 
ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.811661 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76d3f378-0b89-4691-a527-31be2c7922be-console-serving-cert\") pod \"76d3f378-0b89-4691-a527-31be2c7922be\" (UID: \"76d3f378-0b89-4691-a527-31be2c7922be\") " Apr 17 11:17:33.811899 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.811723 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-console-config" (OuterVolumeSpecName: "console-config") pod "76d3f378-0b89-4691-a527-31be2c7922be" (UID: "76d3f378-0b89-4691-a527-31be2c7922be"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:17:33.812066 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.812004 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "76d3f378-0b89-4691-a527-31be2c7922be" (UID: "76d3f378-0b89-4691-a527-31be2c7922be"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:17:33.812066 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.812020 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-service-ca" (OuterVolumeSpecName: "service-ca") pod "76d3f378-0b89-4691-a527-31be2c7922be" (UID: "76d3f378-0b89-4691-a527-31be2c7922be"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:17:33.812066 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.812033 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-console-config\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:33.813948 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.813925 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d3f378-0b89-4691-a527-31be2c7922be-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "76d3f378-0b89-4691-a527-31be2c7922be" (UID: "76d3f378-0b89-4691-a527-31be2c7922be"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:17:33.814077 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.814057 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d3f378-0b89-4691-a527-31be2c7922be-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "76d3f378-0b89-4691-a527-31be2c7922be" (UID: "76d3f378-0b89-4691-a527-31be2c7922be"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:17:33.814379 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.814364 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d3f378-0b89-4691-a527-31be2c7922be-kube-api-access-qk4fh" (OuterVolumeSpecName: "kube-api-access-qk4fh") pod "76d3f378-0b89-4691-a527-31be2c7922be" (UID: "76d3f378-0b89-4691-a527-31be2c7922be"). InnerVolumeSpecName "kube-api-access-qk4fh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:17:33.912714 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.912606 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-service-ca\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:33.912714 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.912653 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76d3f378-0b89-4691-a527-31be2c7922be-oauth-serving-cert\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:33.912714 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.912669 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76d3f378-0b89-4691-a527-31be2c7922be-console-oauth-config\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:33.912714 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.912681 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qk4fh\" (UniqueName: \"kubernetes.io/projected/76d3f378-0b89-4691-a527-31be2c7922be-kube-api-access-qk4fh\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:33.912714 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:33.912693 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76d3f378-0b89-4691-a527-31be2c7922be-console-serving-cert\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:34.471384 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:34.471353 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-654774794f-gl7h8_76d3f378-0b89-4691-a527-31be2c7922be/console/0.log" Apr 17 11:17:34.471901 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:34.471402 2579 generic.go:358] 
"Generic (PLEG): container finished" podID="76d3f378-0b89-4691-a527-31be2c7922be" containerID="0016edb42d47806d7e4f90ae6f2239bd97a345dab6674d42eefa1cbcd91f5ec4" exitCode=2 Apr 17 11:17:34.471901 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:34.471484 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-654774794f-gl7h8" Apr 17 11:17:34.471901 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:34.471489 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-654774794f-gl7h8" event={"ID":"76d3f378-0b89-4691-a527-31be2c7922be","Type":"ContainerDied","Data":"0016edb42d47806d7e4f90ae6f2239bd97a345dab6674d42eefa1cbcd91f5ec4"} Apr 17 11:17:34.471901 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:34.471540 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-654774794f-gl7h8" event={"ID":"76d3f378-0b89-4691-a527-31be2c7922be","Type":"ContainerDied","Data":"8ac5bdf532d96eadc53aecbd5996d21438fc975fdbb5bce6c123d05da69f4863"} Apr 17 11:17:34.471901 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:34.471558 2579 scope.go:117] "RemoveContainer" containerID="0016edb42d47806d7e4f90ae6f2239bd97a345dab6674d42eefa1cbcd91f5ec4" Apr 17 11:17:34.485536 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:34.485515 2579 scope.go:117] "RemoveContainer" containerID="0016edb42d47806d7e4f90ae6f2239bd97a345dab6674d42eefa1cbcd91f5ec4" Apr 17 11:17:34.485913 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:17:34.485892 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0016edb42d47806d7e4f90ae6f2239bd97a345dab6674d42eefa1cbcd91f5ec4\": container with ID starting with 0016edb42d47806d7e4f90ae6f2239bd97a345dab6674d42eefa1cbcd91f5ec4 not found: ID does not exist" containerID="0016edb42d47806d7e4f90ae6f2239bd97a345dab6674d42eefa1cbcd91f5ec4" Apr 17 11:17:34.485965 ip-10-0-133-190 kubenswrapper[2579]: 
I0417 11:17:34.485920 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0016edb42d47806d7e4f90ae6f2239bd97a345dab6674d42eefa1cbcd91f5ec4"} err="failed to get container status \"0016edb42d47806d7e4f90ae6f2239bd97a345dab6674d42eefa1cbcd91f5ec4\": rpc error: code = NotFound desc = could not find container \"0016edb42d47806d7e4f90ae6f2239bd97a345dab6674d42eefa1cbcd91f5ec4\": container with ID starting with 0016edb42d47806d7e4f90ae6f2239bd97a345dab6674d42eefa1cbcd91f5ec4 not found: ID does not exist" Apr 17 11:17:34.496288 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:34.496268 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-654774794f-gl7h8"] Apr 17 11:17:34.498351 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:34.498333 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-654774794f-gl7h8"] Apr 17 11:17:35.703546 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:35.703491 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d3f378-0b89-4691-a527-31be2c7922be" path="/var/lib/kubelet/pods/76d3f378-0b89-4691-a527-31be2c7922be/volumes" Apr 17 11:17:36.481590 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:36.481551 2579 generic.go:358] "Generic (PLEG): container finished" podID="e2778730-467e-4432-9a1a-d5f871276f6d" containerID="21d1de5adf9de028226027355f32c884a7981de5710e43f398f0df629341a91b" exitCode=0 Apr 17 11:17:36.482204 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:36.481637 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-clmj2" event={"ID":"e2778730-467e-4432-9a1a-d5f871276f6d","Type":"ContainerDied","Data":"21d1de5adf9de028226027355f32c884a7981de5710e43f398f0df629341a91b"} Apr 17 11:17:36.482204 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:36.482098 2579 scope.go:117] "RemoveContainer" 
containerID="21d1de5adf9de028226027355f32c884a7981de5710e43f398f0df629341a91b" Apr 17 11:17:36.483306 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:36.483277 2579 generic.go:358] "Generic (PLEG): container finished" podID="b2c84c0f-0d60-4465-b1fd-4f39963e95d4" containerID="e8abba0ae748cf70ef095c45bd134dc8c42dc0475e7e81b63b03ae1ef3a87b81" exitCode=0 Apr 17 11:17:36.483439 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:36.483314 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" event={"ID":"b2c84c0f-0d60-4465-b1fd-4f39963e95d4","Type":"ContainerDied","Data":"e8abba0ae748cf70ef095c45bd134dc8c42dc0475e7e81b63b03ae1ef3a87b81"} Apr 17 11:17:36.483619 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:36.483599 2579 scope.go:117] "RemoveContainer" containerID="e8abba0ae748cf70ef095c45bd134dc8c42dc0475e7e81b63b03ae1ef3a87b81" Apr 17 11:17:37.116891 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.116848 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" podUID="beb961a2-0877-4cd8-be68-062f895cef5d" containerName="registry" containerID="cri-o://6ae9733fe580e2c9f90a09ee0c477c0f23172370d85ce4a945c0f22de9e643ab" gracePeriod=30 Apr 17 11:17:37.395858 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.395835 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:17:37.488501 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.488457 2579 generic.go:358] "Generic (PLEG): container finished" podID="beb961a2-0877-4cd8-be68-062f895cef5d" containerID="6ae9733fe580e2c9f90a09ee0c477c0f23172370d85ce4a945c0f22de9e643ab" exitCode=0 Apr 17 11:17:37.488691 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.488519 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" Apr 17 11:17:37.488691 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.488548 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" event={"ID":"beb961a2-0877-4cd8-be68-062f895cef5d","Type":"ContainerDied","Data":"6ae9733fe580e2c9f90a09ee0c477c0f23172370d85ce4a945c0f22de9e643ab"} Apr 17 11:17:37.488691 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.488589 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-799db9ddcb-l92ps" event={"ID":"beb961a2-0877-4cd8-be68-062f895cef5d","Type":"ContainerDied","Data":"964f2f4bb0ee9a592f102bf52c23c2d1fb1f9c8398df6a134d5bc1331abef4d4"} Apr 17 11:17:37.488691 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.488603 2579 scope.go:117] "RemoveContainer" containerID="6ae9733fe580e2c9f90a09ee0c477c0f23172370d85ce4a945c0f22de9e643ab" Apr 17 11:17:37.490294 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.490270 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-clmj2" event={"ID":"e2778730-467e-4432-9a1a-d5f871276f6d","Type":"ContainerStarted","Data":"b5a20ef7224f5e6cf1822e7b8d490b03c8292992c8664294dc9a37c845e4ccf3"} Apr 17 11:17:37.491938 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.491918 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6hppm" event={"ID":"b2c84c0f-0d60-4465-b1fd-4f39963e95d4","Type":"ContainerStarted","Data":"8f1d5cd00af1a6a642177f23166662d0fa84766673abdb51b9802d70df4bba74"} Apr 17 11:17:37.496796 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.496780 2579 scope.go:117] "RemoveContainer" containerID="6ae9733fe580e2c9f90a09ee0c477c0f23172370d85ce4a945c0f22de9e643ab" Apr 17 11:17:37.497059 ip-10-0-133-190 kubenswrapper[2579]: E0417 
11:17:37.497035 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ae9733fe580e2c9f90a09ee0c477c0f23172370d85ce4a945c0f22de9e643ab\": container with ID starting with 6ae9733fe580e2c9f90a09ee0c477c0f23172370d85ce4a945c0f22de9e643ab not found: ID does not exist" containerID="6ae9733fe580e2c9f90a09ee0c477c0f23172370d85ce4a945c0f22de9e643ab" Apr 17 11:17:37.497133 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.497066 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae9733fe580e2c9f90a09ee0c477c0f23172370d85ce4a945c0f22de9e643ab"} err="failed to get container status \"6ae9733fe580e2c9f90a09ee0c477c0f23172370d85ce4a945c0f22de9e643ab\": rpc error: code = NotFound desc = could not find container \"6ae9733fe580e2c9f90a09ee0c477c0f23172370d85ce4a945c0f22de9e643ab\": container with ID starting with 6ae9733fe580e2c9f90a09ee0c477c0f23172370d85ce4a945c0f22de9e643ab not found: ID does not exist" Apr 17 11:17:37.547399 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.547371 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls\") pod \"beb961a2-0877-4cd8-be68-062f895cef5d\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " Apr 17 11:17:37.547582 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.547422 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/beb961a2-0877-4cd8-be68-062f895cef5d-image-registry-private-configuration\") pod \"beb961a2-0877-4cd8-be68-062f895cef5d\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " Apr 17 11:17:37.547582 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.547467 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/beb961a2-0877-4cd8-be68-062f895cef5d-installation-pull-secrets\") pod \"beb961a2-0877-4cd8-be68-062f895cef5d\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " Apr 17 11:17:37.547582 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.547506 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/beb961a2-0877-4cd8-be68-062f895cef5d-ca-trust-extracted\") pod \"beb961a2-0877-4cd8-be68-062f895cef5d\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " Apr 17 11:17:37.547582 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.547544 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mwlm\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-kube-api-access-4mwlm\") pod \"beb961a2-0877-4cd8-be68-062f895cef5d\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " Apr 17 11:17:37.547582 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.547577 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-bound-sa-token\") pod \"beb961a2-0877-4cd8-be68-062f895cef5d\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " Apr 17 11:17:37.549007 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.547606 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/beb961a2-0877-4cd8-be68-062f895cef5d-registry-certificates\") pod \"beb961a2-0877-4cd8-be68-062f895cef5d\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " Apr 17 11:17:37.549007 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.547643 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/beb961a2-0877-4cd8-be68-062f895cef5d-trusted-ca\") pod \"beb961a2-0877-4cd8-be68-062f895cef5d\" (UID: \"beb961a2-0877-4cd8-be68-062f895cef5d\") " Apr 17 11:17:37.549007 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.548436 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beb961a2-0877-4cd8-be68-062f895cef5d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "beb961a2-0877-4cd8-be68-062f895cef5d" (UID: "beb961a2-0877-4cd8-be68-062f895cef5d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:17:37.549007 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.548631 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beb961a2-0877-4cd8-be68-062f895cef5d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "beb961a2-0877-4cd8-be68-062f895cef5d" (UID: "beb961a2-0877-4cd8-be68-062f895cef5d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:17:37.550008 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.549981 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "beb961a2-0877-4cd8-be68-062f895cef5d" (UID: "beb961a2-0877-4cd8-be68-062f895cef5d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:17:37.550355 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.550331 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-kube-api-access-4mwlm" (OuterVolumeSpecName: "kube-api-access-4mwlm") pod "beb961a2-0877-4cd8-be68-062f895cef5d" (UID: "beb961a2-0877-4cd8-be68-062f895cef5d"). InnerVolumeSpecName "kube-api-access-4mwlm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:17:37.550462 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.550432 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb961a2-0877-4cd8-be68-062f895cef5d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "beb961a2-0877-4cd8-be68-062f895cef5d" (UID: "beb961a2-0877-4cd8-be68-062f895cef5d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:17:37.551440 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.551416 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb961a2-0877-4cd8-be68-062f895cef5d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "beb961a2-0877-4cd8-be68-062f895cef5d" (UID: "beb961a2-0877-4cd8-be68-062f895cef5d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:17:37.552153 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.552124 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "beb961a2-0877-4cd8-be68-062f895cef5d" (UID: "beb961a2-0877-4cd8-be68-062f895cef5d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:17:37.558041 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.558017 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beb961a2-0877-4cd8-be68-062f895cef5d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "beb961a2-0877-4cd8-be68-062f895cef5d" (UID: "beb961a2-0877-4cd8-be68-062f895cef5d"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:17:37.651035 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.651003 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/beb961a2-0877-4cd8-be68-062f895cef5d-installation-pull-secrets\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:37.651244 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.651040 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/beb961a2-0877-4cd8-be68-062f895cef5d-ca-trust-extracted\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:37.651244 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.651057 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4mwlm\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-kube-api-access-4mwlm\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:37.651244 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.651074 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-bound-sa-token\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:37.651244 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.651090 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/beb961a2-0877-4cd8-be68-062f895cef5d-registry-certificates\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:37.651244 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.651105 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/beb961a2-0877-4cd8-be68-062f895cef5d-trusted-ca\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:37.651244 
ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.651120 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/beb961a2-0877-4cd8-be68-062f895cef5d-registry-tls\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:37.651244 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.651136 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/beb961a2-0877-4cd8-be68-062f895cef5d-image-registry-private-configuration\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:37.804021 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.803987 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-799db9ddcb-l92ps"] Apr 17 11:17:37.807745 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:37.807712 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-799db9ddcb-l92ps"] Apr 17 11:17:39.704716 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:39.704677 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beb961a2-0877-4cd8-be68-062f895cef5d" path="/var/lib/kubelet/pods/beb961a2-0877-4cd8-be68-062f895cef5d/volumes" Apr 17 11:17:50.515294 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.515246 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-57b77fd4b5-89vv8" podUID="316ca251-c7ff-4d30-9524-f9a64784a1ac" containerName="console" containerID="cri-o://b215e9e316734e45e831dbc99d736c5fb558d8824679cfe50390fc4e1fa3e31e" gracePeriod=15 Apr 17 11:17:50.875457 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.875434 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57b77fd4b5-89vv8_316ca251-c7ff-4d30-9524-f9a64784a1ac/console/0.log" Apr 17 11:17:50.875589 ip-10-0-133-190 kubenswrapper[2579]: I0417 
11:17:50.875497 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:17:50.959383 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.959345 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcw7w\" (UniqueName: \"kubernetes.io/projected/316ca251-c7ff-4d30-9524-f9a64784a1ac-kube-api-access-kcw7w\") pod \"316ca251-c7ff-4d30-9524-f9a64784a1ac\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " Apr 17 11:17:50.959693 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.959655 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-config\") pod \"316ca251-c7ff-4d30-9524-f9a64784a1ac\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " Apr 17 11:17:50.959862 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.959847 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-service-ca\") pod \"316ca251-c7ff-4d30-9524-f9a64784a1ac\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " Apr 17 11:17:50.960014 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.959998 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-oauth-serving-cert\") pod \"316ca251-c7ff-4d30-9524-f9a64784a1ac\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " Apr 17 11:17:50.960156 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.960139 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-trusted-ca-bundle\") pod \"316ca251-c7ff-4d30-9524-f9a64784a1ac\" (UID: 
\"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " Apr 17 11:17:50.960402 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.960378 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-oauth-config\") pod \"316ca251-c7ff-4d30-9524-f9a64784a1ac\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " Apr 17 11:17:50.960564 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.960542 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-serving-cert\") pod \"316ca251-c7ff-4d30-9524-f9a64784a1ac\" (UID: \"316ca251-c7ff-4d30-9524-f9a64784a1ac\") " Apr 17 11:17:50.961997 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.960461 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-service-ca" (OuterVolumeSpecName: "service-ca") pod "316ca251-c7ff-4d30-9524-f9a64784a1ac" (UID: "316ca251-c7ff-4d30-9524-f9a64784a1ac"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:17:50.961997 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.960694 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-config" (OuterVolumeSpecName: "console-config") pod "316ca251-c7ff-4d30-9524-f9a64784a1ac" (UID: "316ca251-c7ff-4d30-9524-f9a64784a1ac"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:17:50.961997 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.960970 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "316ca251-c7ff-4d30-9524-f9a64784a1ac" (UID: "316ca251-c7ff-4d30-9524-f9a64784a1ac"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:17:50.961997 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.961203 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "316ca251-c7ff-4d30-9524-f9a64784a1ac" (UID: "316ca251-c7ff-4d30-9524-f9a64784a1ac"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:17:50.964241 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.964197 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "316ca251-c7ff-4d30-9524-f9a64784a1ac" (UID: "316ca251-c7ff-4d30-9524-f9a64784a1ac"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:17:50.974427 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.974056 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316ca251-c7ff-4d30-9524-f9a64784a1ac-kube-api-access-kcw7w" (OuterVolumeSpecName: "kube-api-access-kcw7w") pod "316ca251-c7ff-4d30-9524-f9a64784a1ac" (UID: "316ca251-c7ff-4d30-9524-f9a64784a1ac"). InnerVolumeSpecName "kube-api-access-kcw7w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:17:50.976967 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:50.976920 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "316ca251-c7ff-4d30-9524-f9a64784a1ac" (UID: "316ca251-c7ff-4d30-9524-f9a64784a1ac"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:17:51.061675 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.061462 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-config\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:51.061675 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.061501 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-service-ca\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:51.061675 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.061516 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-oauth-serving-cert\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:51.061675 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.061530 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316ca251-c7ff-4d30-9524-f9a64784a1ac-trusted-ca-bundle\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:51.061675 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.061545 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-oauth-config\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:51.061675 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.061563 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/316ca251-c7ff-4d30-9524-f9a64784a1ac-console-serving-cert\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:51.061675 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.061579 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kcw7w\" (UniqueName: \"kubernetes.io/projected/316ca251-c7ff-4d30-9524-f9a64784a1ac-kube-api-access-kcw7w\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:17:51.543287 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.543253 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57b77fd4b5-89vv8_316ca251-c7ff-4d30-9524-f9a64784a1ac/console/0.log" Apr 17 11:17:51.543718 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.543297 2579 generic.go:358] "Generic (PLEG): container finished" podID="316ca251-c7ff-4d30-9524-f9a64784a1ac" containerID="b215e9e316734e45e831dbc99d736c5fb558d8824679cfe50390fc4e1fa3e31e" exitCode=2 Apr 17 11:17:51.543718 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.543358 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57b77fd4b5-89vv8" event={"ID":"316ca251-c7ff-4d30-9524-f9a64784a1ac","Type":"ContainerDied","Data":"b215e9e316734e45e831dbc99d736c5fb558d8824679cfe50390fc4e1fa3e31e"} Apr 17 11:17:51.543718 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.543385 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57b77fd4b5-89vv8" event={"ID":"316ca251-c7ff-4d30-9524-f9a64784a1ac","Type":"ContainerDied","Data":"743ab75fbe9395131b3404d06e9fbe7b559d7d9273dccc56a3357a69d9336b06"} Apr 17 11:17:51.543718 
ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.543400 2579 scope.go:117] "RemoveContainer" containerID="b215e9e316734e45e831dbc99d736c5fb558d8824679cfe50390fc4e1fa3e31e" Apr 17 11:17:51.543718 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.543362 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57b77fd4b5-89vv8" Apr 17 11:17:51.552743 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.552725 2579 scope.go:117] "RemoveContainer" containerID="b215e9e316734e45e831dbc99d736c5fb558d8824679cfe50390fc4e1fa3e31e" Apr 17 11:17:51.553028 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:17:51.553008 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b215e9e316734e45e831dbc99d736c5fb558d8824679cfe50390fc4e1fa3e31e\": container with ID starting with b215e9e316734e45e831dbc99d736c5fb558d8824679cfe50390fc4e1fa3e31e not found: ID does not exist" containerID="b215e9e316734e45e831dbc99d736c5fb558d8824679cfe50390fc4e1fa3e31e" Apr 17 11:17:51.553099 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.553036 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b215e9e316734e45e831dbc99d736c5fb558d8824679cfe50390fc4e1fa3e31e"} err="failed to get container status \"b215e9e316734e45e831dbc99d736c5fb558d8824679cfe50390fc4e1fa3e31e\": rpc error: code = NotFound desc = could not find container \"b215e9e316734e45e831dbc99d736c5fb558d8824679cfe50390fc4e1fa3e31e\": container with ID starting with b215e9e316734e45e831dbc99d736c5fb558d8824679cfe50390fc4e1fa3e31e not found: ID does not exist" Apr 17 11:17:51.565278 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.565251 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57b77fd4b5-89vv8"] Apr 17 11:17:51.568023 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.567996 2579 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-57b77fd4b5-89vv8"] Apr 17 11:17:51.704088 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:17:51.704046 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="316ca251-c7ff-4d30-9524-f9a64784a1ac" path="/var/lib/kubelet/pods/316ca251-c7ff-4d30-9524-f9a64784a1ac/volumes" Apr 17 11:18:18.376503 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:18.376465 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67cbf7fb8c-9zs8r"] Apr 17 11:18:43.397032 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.396967 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-67cbf7fb8c-9zs8r" podUID="6ecada3a-57c6-492e-b4b2-8e738576c1c4" containerName="console" containerID="cri-o://f6788711607b3c67fb35551fe87b6634b67f4d5a4490025807fe611acc351241" gracePeriod=15 Apr 17 11:18:43.635996 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.635970 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67cbf7fb8c-9zs8r_6ecada3a-57c6-492e-b4b2-8e738576c1c4/console/0.log" Apr 17 11:18:43.636108 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.636031 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:18:43.697117 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.697089 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-config\") pod \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " Apr 17 11:18:43.697117 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.697121 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-serving-cert\") pod \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " Apr 17 11:18:43.697348 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.697150 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-service-ca\") pod \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " Apr 17 11:18:43.697348 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.697166 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-oauth-config\") pod \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " Apr 17 11:18:43.697348 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.697293 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgddd\" (UniqueName: \"kubernetes.io/projected/6ecada3a-57c6-492e-b4b2-8e738576c1c4-kube-api-access-jgddd\") pod \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " Apr 17 11:18:43.697495 
ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.697357 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-oauth-serving-cert\") pod \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " Apr 17 11:18:43.697555 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.697532 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-trusted-ca-bundle\") pod \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\" (UID: \"6ecada3a-57c6-492e-b4b2-8e738576c1c4\") " Apr 17 11:18:43.697555 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.697540 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-config" (OuterVolumeSpecName: "console-config") pod "6ecada3a-57c6-492e-b4b2-8e738576c1c4" (UID: "6ecada3a-57c6-492e-b4b2-8e738576c1c4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:18:43.697666 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.697558 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-service-ca" (OuterVolumeSpecName: "service-ca") pod "6ecada3a-57c6-492e-b4b2-8e738576c1c4" (UID: "6ecada3a-57c6-492e-b4b2-8e738576c1c4"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:18:43.697666 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.697641 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6ecada3a-57c6-492e-b4b2-8e738576c1c4" (UID: "6ecada3a-57c6-492e-b4b2-8e738576c1c4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:18:43.697877 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.697859 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-service-ca\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:18:43.697924 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.697885 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-oauth-serving-cert\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:18:43.697924 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.697901 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-config\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:18:43.698037 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.698012 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6ecada3a-57c6-492e-b4b2-8e738576c1c4" (UID: "6ecada3a-57c6-492e-b4b2-8e738576c1c4"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:18:43.699328 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.699305 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6ecada3a-57c6-492e-b4b2-8e738576c1c4" (UID: "6ecada3a-57c6-492e-b4b2-8e738576c1c4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:18:43.699453 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.699430 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ecada3a-57c6-492e-b4b2-8e738576c1c4-kube-api-access-jgddd" (OuterVolumeSpecName: "kube-api-access-jgddd") pod "6ecada3a-57c6-492e-b4b2-8e738576c1c4" (UID: "6ecada3a-57c6-492e-b4b2-8e738576c1c4"). InnerVolumeSpecName "kube-api-access-jgddd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:18:43.699515 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.699481 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6ecada3a-57c6-492e-b4b2-8e738576c1c4" (UID: "6ecada3a-57c6-492e-b4b2-8e738576c1c4"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:18:43.705599 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.705583 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67cbf7fb8c-9zs8r_6ecada3a-57c6-492e-b4b2-8e738576c1c4/console/0.log" Apr 17 11:18:43.705699 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.705619 2579 generic.go:358] "Generic (PLEG): container finished" podID="6ecada3a-57c6-492e-b4b2-8e738576c1c4" containerID="f6788711607b3c67fb35551fe87b6634b67f4d5a4490025807fe611acc351241" exitCode=2 Apr 17 11:18:43.705699 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.705643 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67cbf7fb8c-9zs8r" event={"ID":"6ecada3a-57c6-492e-b4b2-8e738576c1c4","Type":"ContainerDied","Data":"f6788711607b3c67fb35551fe87b6634b67f4d5a4490025807fe611acc351241"} Apr 17 11:18:43.705699 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.705678 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67cbf7fb8c-9zs8r" event={"ID":"6ecada3a-57c6-492e-b4b2-8e738576c1c4","Type":"ContainerDied","Data":"3f25b90a2068f28cc9401bda915de4c183d830d17f6791e5065f7ea9b48915ed"} Apr 17 11:18:43.705699 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.705696 2579 scope.go:117] "RemoveContainer" containerID="f6788711607b3c67fb35551fe87b6634b67f4d5a4490025807fe611acc351241" Apr 17 11:18:43.705909 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.705701 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67cbf7fb8c-9zs8r" Apr 17 11:18:43.718652 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.718634 2579 scope.go:117] "RemoveContainer" containerID="f6788711607b3c67fb35551fe87b6634b67f4d5a4490025807fe611acc351241" Apr 17 11:18:43.718939 ip-10-0-133-190 kubenswrapper[2579]: E0417 11:18:43.718921 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6788711607b3c67fb35551fe87b6634b67f4d5a4490025807fe611acc351241\": container with ID starting with f6788711607b3c67fb35551fe87b6634b67f4d5a4490025807fe611acc351241 not found: ID does not exist" containerID="f6788711607b3c67fb35551fe87b6634b67f4d5a4490025807fe611acc351241" Apr 17 11:18:43.718989 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.718947 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6788711607b3c67fb35551fe87b6634b67f4d5a4490025807fe611acc351241"} err="failed to get container status \"f6788711607b3c67fb35551fe87b6634b67f4d5a4490025807fe611acc351241\": rpc error: code = NotFound desc = could not find container \"f6788711607b3c67fb35551fe87b6634b67f4d5a4490025807fe611acc351241\": container with ID starting with f6788711607b3c67fb35551fe87b6634b67f4d5a4490025807fe611acc351241 not found: ID does not exist" Apr 17 11:18:43.737423 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.737397 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67cbf7fb8c-9zs8r"] Apr 17 11:18:43.750237 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.750213 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-67cbf7fb8c-9zs8r"] Apr 17 11:18:43.799047 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.799025 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-serving-cert\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:18:43.799047 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.799047 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6ecada3a-57c6-492e-b4b2-8e738576c1c4-console-oauth-config\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:18:43.799171 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.799060 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jgddd\" (UniqueName: \"kubernetes.io/projected/6ecada3a-57c6-492e-b4b2-8e738576c1c4-kube-api-access-jgddd\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:18:43.799171 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:43.799069 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ecada3a-57c6-492e-b4b2-8e738576c1c4-trusted-ca-bundle\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Apr 17 11:18:45.704114 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:18:45.704080 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ecada3a-57c6-492e-b4b2-8e738576c1c4" path="/var/lib/kubelet/pods/6ecada3a-57c6-492e-b4b2-8e738576c1c4/volumes" Apr 17 11:20:37.644294 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:20:37.644266 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9wmp5_89b9086c-55f6-4d0b-a998-d22a793d7d17/console-operator/1.log" Apr 17 11:20:37.645022 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:20:37.644989 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9wmp5_89b9086c-55f6-4d0b-a998-d22a793d7d17/console-operator/1.log" Apr 17 11:20:37.650825 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:20:37.650804 2579 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxcn4_431e03f9-9af4-4fa7-8f47-c50f52e2a7e5/ovn-acl-logging/0.log" Apr 17 11:20:37.651410 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:20:37.651390 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxcn4_431e03f9-9af4-4fa7-8f47-c50f52e2a7e5/ovn-acl-logging/0.log" Apr 17 11:20:37.654637 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:20:37.654619 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 11:23:56.926978 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:23:56.926947 2579 ???:1] "http: TLS handshake error from 10.0.130.188:33142: EOF" Apr 17 11:23:56.928634 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:23:56.928615 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qv726_65ec91b4-9666-4574-ad59-be0c3d01c971/global-pull-secret-syncer/0.log" Apr 17 11:23:57.006066 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:23:57.006038 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-46jrw_5652250a-f654-49f4-a3fa-82e77fa0b777/konnectivity-agent/0.log" Apr 17 11:23:57.277200 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:23:57.277131 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-190.ec2.internal_1a8754b0f8554418b50db3ceda052dae/haproxy/0.log" Apr 17 11:24:00.266528 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:00.266493 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-65j5q_cfa96c8c-1c5c-4749-a74a-b6eab2274afd/cluster-monitoring-operator/0.log" Apr 17 11:24:00.783948 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:00.783921 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dm5sp_0db0e0ab-e7ee-4975-8084-8c78f76a35ce/node-exporter/0.log" Apr 
17 11:24:00.820104 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:00.820068 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dm5sp_0db0e0ab-e7ee-4975-8084-8c78f76a35ce/kube-rbac-proxy/0.log" Apr 17 11:24:00.840173 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:00.840145 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dm5sp_0db0e0ab-e7ee-4975-8084-8c78f76a35ce/init-textfile/0.log" Apr 17 11:24:01.224019 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:01.223986 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-w9zc9_ddb1e496-a5cc-4d00-9aef-856c4b501cf7/prometheus-operator-admission-webhook/0.log" Apr 17 11:24:02.603937 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:02.603899 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-xjjnj_d90dfee2-676e-4224-8ec8-8d764b523802/networking-console-plugin/0.log" Apr 17 11:24:03.080993 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:03.080962 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9wmp5_89b9086c-55f6-4d0b-a998-d22a793d7d17/console-operator/1.log" Apr 17 11:24:03.086522 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:03.086487 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9wmp5_89b9086c-55f6-4d0b-a998-d22a793d7d17/console-operator/2.log" Apr 17 11:24:03.520387 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:03.520359 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-cqq7h_9473d957-8483-46e1-86cf-c51754d8c054/download-server/0.log" Apr 17 11:24:03.946222 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:03.946200 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-dl54m_92e7a74f-35b8-4f70-833b-1261a2bed50d/volume-data-source-validator/0.log" Apr 17 11:24:04.544433 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.544402 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7s657/perf-node-gather-daemonset-96h44"] Apr 17 11:24:04.544749 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.544735 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="316ca251-c7ff-4d30-9524-f9a64784a1ac" containerName="console" Apr 17 11:24:04.544816 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.544751 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="316ca251-c7ff-4d30-9524-f9a64784a1ac" containerName="console" Apr 17 11:24:04.544816 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.544761 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76d3f378-0b89-4691-a527-31be2c7922be" containerName="console" Apr 17 11:24:04.544816 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.544790 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d3f378-0b89-4691-a527-31be2c7922be" containerName="console" Apr 17 11:24:04.544816 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.544804 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beb961a2-0877-4cd8-be68-062f895cef5d" containerName="registry" Apr 17 11:24:04.544816 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.544814 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb961a2-0877-4cd8-be68-062f895cef5d" containerName="registry" Apr 17 11:24:04.544977 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.544858 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ecada3a-57c6-492e-b4b2-8e738576c1c4" containerName="console" Apr 17 11:24:04.544977 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.544868 2579 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6ecada3a-57c6-492e-b4b2-8e738576c1c4" containerName="console" Apr 17 11:24:04.544977 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.544941 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="76d3f378-0b89-4691-a527-31be2c7922be" containerName="console" Apr 17 11:24:04.544977 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.544955 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="beb961a2-0877-4cd8-be68-062f895cef5d" containerName="registry" Apr 17 11:24:04.544977 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.544963 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="316ca251-c7ff-4d30-9524-f9a64784a1ac" containerName="console" Apr 17 11:24:04.544977 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.544970 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ecada3a-57c6-492e-b4b2-8e738576c1c4" containerName="console" Apr 17 11:24:04.547949 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.547930 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.551787 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.551742 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7s657\"/\"openshift-service-ca.crt\"" Apr 17 11:24:04.552157 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.552137 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7s657\"/\"default-dockercfg-7h8qt\"" Apr 17 11:24:04.552256 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.552144 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7s657\"/\"kube-root-ca.crt\"" Apr 17 11:24:04.567054 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.567030 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7s657/perf-node-gather-daemonset-96h44"] Apr 17 11:24:04.643992 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.643950 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3fffba7c-2f3c-41fb-81bf-a56625911ce4-proc\") pod \"perf-node-gather-daemonset-96h44\" (UID: \"3fffba7c-2f3c-41fb-81bf-a56625911ce4\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.644156 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.644028 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3fffba7c-2f3c-41fb-81bf-a56625911ce4-sys\") pod \"perf-node-gather-daemonset-96h44\" (UID: \"3fffba7c-2f3c-41fb-81bf-a56625911ce4\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.644156 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.644056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-2fj77\" (UniqueName: \"kubernetes.io/projected/3fffba7c-2f3c-41fb-81bf-a56625911ce4-kube-api-access-2fj77\") pod \"perf-node-gather-daemonset-96h44\" (UID: \"3fffba7c-2f3c-41fb-81bf-a56625911ce4\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.644156 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.644081 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3fffba7c-2f3c-41fb-81bf-a56625911ce4-podres\") pod \"perf-node-gather-daemonset-96h44\" (UID: \"3fffba7c-2f3c-41fb-81bf-a56625911ce4\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.644156 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.644104 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3fffba7c-2f3c-41fb-81bf-a56625911ce4-lib-modules\") pod \"perf-node-gather-daemonset-96h44\" (UID: \"3fffba7c-2f3c-41fb-81bf-a56625911ce4\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.745377 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.745336 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3fffba7c-2f3c-41fb-81bf-a56625911ce4-sys\") pod \"perf-node-gather-daemonset-96h44\" (UID: \"3fffba7c-2f3c-41fb-81bf-a56625911ce4\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.745560 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.745390 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fj77\" (UniqueName: \"kubernetes.io/projected/3fffba7c-2f3c-41fb-81bf-a56625911ce4-kube-api-access-2fj77\") pod \"perf-node-gather-daemonset-96h44\" (UID: \"3fffba7c-2f3c-41fb-81bf-a56625911ce4\") " 
pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.745560 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.745470 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3fffba7c-2f3c-41fb-81bf-a56625911ce4-sys\") pod \"perf-node-gather-daemonset-96h44\" (UID: \"3fffba7c-2f3c-41fb-81bf-a56625911ce4\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.745694 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.745592 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3fffba7c-2f3c-41fb-81bf-a56625911ce4-podres\") pod \"perf-node-gather-daemonset-96h44\" (UID: \"3fffba7c-2f3c-41fb-81bf-a56625911ce4\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.745694 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.745652 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3fffba7c-2f3c-41fb-81bf-a56625911ce4-lib-modules\") pod \"perf-node-gather-daemonset-96h44\" (UID: \"3fffba7c-2f3c-41fb-81bf-a56625911ce4\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.745810 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.745714 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3fffba7c-2f3c-41fb-81bf-a56625911ce4-proc\") pod \"perf-node-gather-daemonset-96h44\" (UID: \"3fffba7c-2f3c-41fb-81bf-a56625911ce4\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.745810 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.745756 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3fffba7c-2f3c-41fb-81bf-a56625911ce4-podres\") pod 
\"perf-node-gather-daemonset-96h44\" (UID: \"3fffba7c-2f3c-41fb-81bf-a56625911ce4\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.745810 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.745792 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3fffba7c-2f3c-41fb-81bf-a56625911ce4-lib-modules\") pod \"perf-node-gather-daemonset-96h44\" (UID: \"3fffba7c-2f3c-41fb-81bf-a56625911ce4\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.745917 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.745824 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3fffba7c-2f3c-41fb-81bf-a56625911ce4-proc\") pod \"perf-node-gather-daemonset-96h44\" (UID: \"3fffba7c-2f3c-41fb-81bf-a56625911ce4\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.753477 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.753448 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fj77\" (UniqueName: \"kubernetes.io/projected/3fffba7c-2f3c-41fb-81bf-a56625911ce4-kube-api-access-2fj77\") pod \"perf-node-gather-daemonset-96h44\" (UID: \"3fffba7c-2f3c-41fb-81bf-a56625911ce4\") " pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.807017 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.806949 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tkcmp_4dd7876b-a6b7-4cf0-b645-979aead5bdff/dns/0.log" Apr 17 11:24:04.826822 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.826795 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tkcmp_4dd7876b-a6b7-4cf0-b645-979aead5bdff/kube-rbac-proxy/0.log" Apr 17 11:24:04.857952 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.857928 2579 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:04.873433 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.873411 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jrxcq_7920eeb8-72c9-4fe5-aff5-30f78ed7f840/dns-node-resolver/0.log" Apr 17 11:24:04.980957 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.980928 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7s657/perf-node-gather-daemonset-96h44"] Apr 17 11:24:04.984156 ip-10-0-133-190 kubenswrapper[2579]: W0417 11:24:04.984128 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3fffba7c_2f3c_41fb_81bf_a56625911ce4.slice/crio-4472cd9f39152d8376452809665796f634d89c0191bd065a832bd28dbc1295b4 WatchSource:0}: Error finding container 4472cd9f39152d8376452809665796f634d89c0191bd065a832bd28dbc1295b4: Status 404 returned error can't find the container with id 4472cd9f39152d8376452809665796f634d89c0191bd065a832bd28dbc1295b4 Apr 17 11:24:04.985691 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:04.985673 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:24:05.355932 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:05.355853 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2678k_0a2631b2-add8-43b1-a9b4-b872018c7373/node-ca/0.log" Apr 17 11:24:05.638039 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:05.637955 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" event={"ID":"3fffba7c-2f3c-41fb-81bf-a56625911ce4","Type":"ContainerStarted","Data":"ddd23b4500fd10cf21c72037357145ae81caa01bb08ca65e003e65780d0cb548"} Apr 17 11:24:05.638039 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:05.637997 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" event={"ID":"3fffba7c-2f3c-41fb-81bf-a56625911ce4","Type":"ContainerStarted","Data":"4472cd9f39152d8376452809665796f634d89c0191bd065a832bd28dbc1295b4"} Apr 17 11:24:05.638219 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:05.638136 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:05.654878 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:05.654832 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" podStartSLOduration=1.654813647 podStartE2EDuration="1.654813647s" podCreationTimestamp="2026-04-17 11:24:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:24:05.653274091 +0000 UTC m=+508.519798241" watchObservedRunningTime="2026-04-17 11:24:05.654813647 +0000 UTC m=+508.521337796" Apr 17 11:24:06.107969 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:06.107936 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7657d8478-pp7qf_615b2482-c02f-4752-9ea6-7e7cef5c1fe9/router/0.log" Apr 17 11:24:06.449071 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:06.449040 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-sswv7_05a441fa-9d9b-40d1-adfd-ffe296dfb2d0/serve-healthcheck-canary/0.log" Apr 17 11:24:06.819251 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:06.819176 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-clmj2_e2778730-467e-4432-9a1a-d5f871276f6d/insights-operator/1.log" Apr 17 11:24:06.819400 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:06.819349 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-clmj2_e2778730-467e-4432-9a1a-d5f871276f6d/insights-operator/0.log" Apr 17 11:24:06.961177 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:06.961152 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pp8hg_acfe3db2-72d7-40a4-bd0f-d52828be6509/kube-rbac-proxy/0.log" Apr 17 11:24:06.983081 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:06.983054 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pp8hg_acfe3db2-72d7-40a4-bd0f-d52828be6509/exporter/0.log" Apr 17 11:24:07.001982 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:07.001959 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pp8hg_acfe3db2-72d7-40a4-bd0f-d52828be6509/extractor/0.log" Apr 17 11:24:11.650740 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:11.650711 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7s657/perf-node-gather-daemonset-96h44" Apr 17 11:24:12.896043 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:12.896015 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-w48tf_55842e83-3d9a-4294-a446-3bfe192d7a19/migrator/0.log" Apr 17 11:24:12.919648 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:12.919622 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-w48tf_55842e83-3d9a-4294-a446-3bfe192d7a19/graceful-termination/0.log" Apr 17 11:24:13.293649 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:13.293616 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6hppm_b2c84c0f-0d60-4465-b1fd-4f39963e95d4/kube-storage-version-migrator-operator/1.log" Apr 
17 11:24:13.294506 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:13.294486 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6hppm_b2c84c0f-0d60-4465-b1fd-4f39963e95d4/kube-storage-version-migrator-operator/0.log" Apr 17 11:24:14.317668 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:14.317653 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-86dh5_389cf577-bd02-4903-96d9-cdc3fd99d418/kube-multus/0.log" Apr 17 11:24:14.670788 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:14.670746 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dvnsm_841bd702-cde2-4bf5-9789-aa664c501f8f/kube-multus-additional-cni-plugins/0.log" Apr 17 11:24:14.692657 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:14.692631 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dvnsm_841bd702-cde2-4bf5-9789-aa664c501f8f/egress-router-binary-copy/0.log" Apr 17 11:24:14.714365 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:14.714342 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dvnsm_841bd702-cde2-4bf5-9789-aa664c501f8f/cni-plugins/0.log" Apr 17 11:24:14.736829 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:14.736808 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dvnsm_841bd702-cde2-4bf5-9789-aa664c501f8f/bond-cni-plugin/0.log" Apr 17 11:24:14.759468 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:14.759448 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dvnsm_841bd702-cde2-4bf5-9789-aa664c501f8f/routeoverride-cni/0.log" Apr 17 11:24:14.782579 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:14.782558 2579 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dvnsm_841bd702-cde2-4bf5-9789-aa664c501f8f/whereabouts-cni-bincopy/0.log" Apr 17 11:24:14.806472 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:14.806448 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dvnsm_841bd702-cde2-4bf5-9789-aa664c501f8f/whereabouts-cni/0.log" Apr 17 11:24:14.931924 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:14.931845 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9g7pq_0ba74b24-e523-481e-82b5-080dc7ecb2e2/network-metrics-daemon/0.log" Apr 17 11:24:14.953801 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:14.953760 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9g7pq_0ba74b24-e523-481e-82b5-080dc7ecb2e2/kube-rbac-proxy/0.log" Apr 17 11:24:15.843024 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:15.842942 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxcn4_431e03f9-9af4-4fa7-8f47-c50f52e2a7e5/ovn-controller/0.log" Apr 17 11:24:15.861851 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:15.861823 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxcn4_431e03f9-9af4-4fa7-8f47-c50f52e2a7e5/ovn-acl-logging/0.log" Apr 17 11:24:15.864091 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:15.864063 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxcn4_431e03f9-9af4-4fa7-8f47-c50f52e2a7e5/ovn-acl-logging/1.log" Apr 17 11:24:15.881708 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:15.881675 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxcn4_431e03f9-9af4-4fa7-8f47-c50f52e2a7e5/kube-rbac-proxy-node/0.log" Apr 17 11:24:15.902134 ip-10-0-133-190 kubenswrapper[2579]: I0417 
11:24:15.902110 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxcn4_431e03f9-9af4-4fa7-8f47-c50f52e2a7e5/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 11:24:15.925678 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:15.925658 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxcn4_431e03f9-9af4-4fa7-8f47-c50f52e2a7e5/northd/0.log" Apr 17 11:24:15.954096 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:15.954076 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxcn4_431e03f9-9af4-4fa7-8f47-c50f52e2a7e5/nbdb/0.log" Apr 17 11:24:15.978221 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:15.978186 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxcn4_431e03f9-9af4-4fa7-8f47-c50f52e2a7e5/sbdb/0.log" Apr 17 11:24:16.065949 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:16.065917 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxcn4_431e03f9-9af4-4fa7-8f47-c50f52e2a7e5/ovnkube-controller/0.log" Apr 17 11:24:17.729968 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:17.729923 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-fdqlp_8a1529e3-e492-461e-9d34-440e5555b197/check-endpoints/0.log" Apr 17 11:24:17.756006 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:17.755977 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-47nt5_bfa20876-9d47-42bf-aad5-24503e05b86e/network-check-target-container/0.log" Apr 17 11:24:18.792642 ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:18.792610 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-t9jv4_ec49f65a-cac1-4bb8-8dd5-f77b34ef2282/iptables-alerter/0.log" Apr 17 11:24:19.465876 
ip-10-0-133-190 kubenswrapper[2579]: I0417 11:24:19.465850 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-zc9p4_bed77416-eb94-411f-885d-cc01490e88a0/tuned/0.log"