Apr 21 15:32:47.300505 ip-10-0-133-158 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 15:32:47.300519 ip-10-0-133-158 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 15:32:47.300529 ip-10-0-133-158 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 15:32:47.300865 ip-10-0-133-158 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 15:32:57.334317 ip-10-0-133-158 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 15:32:57.334337 ip-10-0-133-158 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 003a940ac36e4d788d9b1d9b85602e6d --
Apr 21 15:35:22.762409 ip-10-0-133-158 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 15:35:23.267296 ip-10-0-133-158 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:23.267296 ip-10-0-133-158 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 15:35:23.267296 ip-10-0-133-158 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:23.267296 ip-10-0-133-158 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 15:35:23.267296 ip-10-0-133-158 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:23.270281 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.270191 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 15:35:23.276982 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.276961 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:23.276982 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.276981 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.276985 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.276989 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.276993 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277002 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277005 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277009 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277012 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277014 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277017 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277019 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277022 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277025 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277027 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277030 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277033 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277035 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277038 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277041 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277043 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:23.277056 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277045 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277048 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277050 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277053 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277056 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277058 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277061 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277064 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277067 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277070 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277073 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277076 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277078 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277081 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277084 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277086 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277089 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277092 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277094 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277098 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:23.277547 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277102 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277105 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277108 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277110 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277113 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277115 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277118 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277120 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277123 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277125 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277141 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277144 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277146 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277148 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277153 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277156 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277164 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277167 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277171 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277174 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:23.278046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277176 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277179 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277182 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277184 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277187 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277189 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277194 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277198 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277201 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277204 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277207 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277210 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277213 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277215 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277218 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277220 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277223 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277228 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277231 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:23.278581 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277234 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277236 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277239 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277242 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277244 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277247 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277698 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277704 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277707 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277716 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277719 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277722 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277730 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277733 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277736 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277738 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277741 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277744 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277746 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277749 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:23.279046 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277751 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277754 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277757 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277759 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277762 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277765 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277768 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277771 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277774 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277776 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277780 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277783 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277786 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277788 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277792 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277794 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277797 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277799 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277802 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:23.279524 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277804 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277806 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277809 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277817 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277820 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277823 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277825 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277827 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277830 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277832 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277835 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277838 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277840 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277843 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277845 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277848 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277850 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277853 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277857 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277860 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:23.280028 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277863 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277866 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277868 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277871 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277873 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277876 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277879 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277881 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277884 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277886 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277888 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277893 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277896 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277899 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277902 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277904 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277917 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277920 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277922 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:23.280565 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277924 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:23.281039 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277927 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:23.281039 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277929 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:23.281039 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277932 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:23.281039 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277934 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:23.281039 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277937 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:23.281039 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277939 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:23.281039 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277942 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:23.281039 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277944 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:23.281039 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277946 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:23.281039 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277949 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:23.281039 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277955 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:23.281039 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277958 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:23.281039 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.277960 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:23.281575 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281559 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 15:35:23.281610 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281573 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 15:35:23.281610 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281580 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 15:35:23.281610 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281586 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 15:35:23.281610 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281590 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 15:35:23.281610 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281594 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 15:35:23.281610 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281599 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 15:35:23.281610 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281603 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 15:35:23.281610 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281606 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 15:35:23.281610 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281610 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 15:35:23.281610 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281613 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281617 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281621 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281624 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281627 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281630 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281634 2569 flags.go:64] FLAG: --cloud-config=""
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281637 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281641 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281645 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281648 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281651 2569 flags.go:64] FLAG: --config-dir=""
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281654 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281658 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281662 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281665 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281668 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281671 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281674 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281678 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281681 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281684 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281687 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281692 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281695 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 15:35:23.281855 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281698 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281700 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281703 2569 flags.go:64] FLAG: --enable-server="true"
Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281706 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281711 2569 flags.go:64] FLAG: --event-burst="100"
Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281715 2569 flags.go:64] FLAG: --event-qps="50"
Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281718 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281721 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281723 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281728 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281731 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281735 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21
15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281737 2569 flags.go:64] FLAG: --eviction-soft="" Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281740 2569 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281744 2569 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281747 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281750 2569 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281753 2569 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281756 2569 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281758 2569 flags.go:64] FLAG: --feature-gates="" Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281762 2569 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281766 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281769 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281772 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281775 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 21 15:35:23.282475 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281778 2569 flags.go:64] FLAG: --help="false" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281782 2569 flags.go:64] FLAG: 
--hostname-override="ip-10-0-133-158.ec2.internal" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281785 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281788 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281791 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281794 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281798 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281801 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281804 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281806 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281809 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281812 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281815 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281818 2569 flags.go:64] FLAG: --kube-reserved="" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281821 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 15:35:23.283080 ip-10-0-133-158 
kubenswrapper[2569]: I0421 15:35:23.281824 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281828 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281831 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281833 2569 flags.go:64] FLAG: --lock-file="" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281836 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281839 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281843 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281848 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 15:35:23.283080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281851 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281854 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281857 2569 flags.go:64] FLAG: --logging-format="text" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281860 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281864 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281867 2569 flags.go:64] FLAG: --manifest-url="" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281870 2569 flags.go:64] FLAG: 
--manifest-url-header="" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281875 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281878 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281882 2569 flags.go:64] FLAG: --max-pods="110" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281885 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281888 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281891 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281894 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281897 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281900 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281904 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281912 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281915 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281918 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281921 2569 flags.go:64] FLAG: --pod-cidr="" Apr 21 15:35:23.283638 
ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281924 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281931 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281933 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 15:35:23.283638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281937 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281940 2569 flags.go:64] FLAG: --port="10250" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281944 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281947 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0fd23bf6d45107a54" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281950 2569 flags.go:64] FLAG: --qos-reserved="" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281953 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281957 2569 flags.go:64] FLAG: --register-node="true" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281959 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281963 2569 flags.go:64] FLAG: --register-with-taints="" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281968 2569 flags.go:64] FLAG: --registry-burst="10" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281971 2569 flags.go:64] FLAG: --registry-qps="5" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: 
I0421 15:35:23.281974 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281977 2569 flags.go:64] FLAG: --reserved-memory="" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281981 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281984 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281987 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281990 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281993 2569 flags.go:64] FLAG: --runonce="false" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.281997 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282000 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282004 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282007 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282010 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282013 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282016 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282020 2569 flags.go:64] FLAG: --storage-driver-password="root" 
Apr 21 15:35:23.284213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282023 2569 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282026 2569 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282029 2569 flags.go:64] FLAG: --storage-driver-user="root"
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282032 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282035 2569 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282038 2569 flags.go:64] FLAG: --system-cgroups=""
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282041 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282046 2569 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282049 2569 flags.go:64] FLAG: --tls-cert-file=""
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282052 2569 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282057 2569 flags.go:64] FLAG: --tls-min-version=""
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282060 2569 flags.go:64] FLAG: --tls-private-key-file=""
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282063 2569 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282066 2569 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282069 2569 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282072 2569 flags.go:64] FLAG: --v="2"
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282077 2569 flags.go:64] FLAG: --version="false"
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282081 2569 flags.go:64] FLAG: --vmodule=""
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282086 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.282089 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282227 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282232 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282235 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282238 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:23.284842 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282241 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282244 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282246 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282249 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282252 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282254 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282257 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282259 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282262 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282265 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282267 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282269 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282273 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282277 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282279 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282282 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282285 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282287 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282290 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:23.285449 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282293 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282295 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282298 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282300 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282303 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282311 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282314 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282316 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282319 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282321 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282324 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282326 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282330 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282334 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282337 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282340 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282342 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282345 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282348 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:23.285959 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282350 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282353 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282356 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282358 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282360 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282363 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282365 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282368 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282370 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282373 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282375 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282378 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282381 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282383 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282385 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282388 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282391 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282393 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282396 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282399 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:23.286572 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282402 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282404 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282407 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282409 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282412 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282414 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282417 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282420 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282422 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282425 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282427 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282429 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282432 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282434 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282437 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282440 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282442 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282444 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282447 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282449 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:23.287071 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282452 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:23.287590 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282455 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:23.287590 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282457 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:23.287590 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.282460 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:23.287590 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.283107 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 15:35:23.290506 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.290485 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 15:35:23.290562 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.290507 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 15:35:23.290593 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290567 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:23.290593 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290573 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:23.290593 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290577 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:23.290593 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290580 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:23.290593 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290589 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:23.290593 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290593 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:23.290593 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290596 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290601 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290605 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290608 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290611 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290614 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290618 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290621 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290625 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290627 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290630 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290632 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290635 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290637 2569
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290640 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290643 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290645 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290647 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290650 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:35:23.290766 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290653 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290655 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290658 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290661 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290664 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290666 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290669 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 
15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290671 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290674 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290676 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290679 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290682 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290690 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290693 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290695 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290699 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290701 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290704 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290706 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 
15:35:23.290709 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:35:23.291257 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290712 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290714 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290717 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290719 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290722 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290724 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290726 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290729 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290732 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290734 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290736 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290739 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:35:23.291757 
ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290742 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290744 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290747 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290749 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290751 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290754 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290756 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290759 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:35:23.291757 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290761 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290764 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290766 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290768 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290772 2569 feature_gate.go:328] unrecognized 
feature gate: AzureMultiDisk Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290782 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290785 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290788 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290791 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290793 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290796 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290798 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290801 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290803 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290806 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290808 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290811 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 
15:35:23.290814 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290816 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290819 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:35:23.292265 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290821 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:35:23.293077 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.290826 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 15:35:23.293077 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290954 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:35:23.293077 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290959 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:35:23.293077 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290962 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:35:23.293077 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290965 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:35:23.293077 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290968 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:35:23.293077 ip-10-0-133-158 
kubenswrapper[2569]: W0421 15:35:23.290970 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:35:23.293077 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290973 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:35:23.293077 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290976 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:35:23.293077 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290979 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:35:23.293077 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290981 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:35:23.293077 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290984 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:35:23.293077 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290986 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:35:23.293077 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290989 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:35:23.293077 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290992 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:35:23.293077 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290994 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.290997 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291006 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291009 2569 feature_gate.go:328] unrecognized feature 
gate: NetworkSegmentation Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291013 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291016 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291018 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291021 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291024 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291026 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291029 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291031 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291034 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291036 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291039 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291042 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:35:23.293511 ip-10-0-133-158 
kubenswrapper[2569]: W0421 15:35:23.291044 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291047 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291049 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291052 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:35:23.293511 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291054 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291057 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291061 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291064 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291067 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291070 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291073 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291076 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291078 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291081 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291084 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291087 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291089 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291091 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291094 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291102 2569 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291106 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291108 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291111 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:35:23.294008 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291113 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291116 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291119 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291122 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291125 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291146 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291149 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291152 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291154 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291157 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291159 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291162 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291164 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291167 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291169 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291171 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291174 2569 feature_gate.go:328] 
unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291176 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291179 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291181 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:35:23.294478 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291183 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:35:23.294956 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291186 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:35:23.294956 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291188 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:35:23.294956 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291191 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:35:23.294956 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291193 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:35:23.294956 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291195 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:35:23.294956 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291198 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:35:23.294956 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291200 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:35:23.294956 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291203 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:35:23.294956 ip-10-0-133-158 kubenswrapper[2569]: W0421 
15:35:23.291213 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:23.294956 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291216 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:23.294956 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291218 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:23.294956 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:23.291221 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:23.294956 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.291227 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 15:35:23.294956 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.291859 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 15:35:23.295320 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.295245 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 15:35:23.296632 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.296617 2569 server.go:1019] "Starting client certificate rotation"
Apr 21 15:35:23.296733 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.296716 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 15:35:23.296770 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.296761 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 15:35:23.330850 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.330822 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 15:35:23.333504 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.333486 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 15:35:23.344458 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.344440 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 21 15:35:23.351032 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.351014 2569 log.go:25] "Validated CRI v1 image API"
Apr 21 15:35:23.352123 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.352109 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 15:35:23.358013 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.357987 2569 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a746b37f-e1b5-4d0c-ba9e-fb87ee9cdde8:/dev/nvme0n1p4 d39b03e8-a231-4333-822c-81437983b315:/dev/nvme0n1p3]
Apr 21 15:35:23.358059 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.358014 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 15:35:23.365742 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.365627 2569 manager.go:217] Machine: {Timestamp:2026-04-21 15:35:23.362033853 +0000 UTC m=+0.464485465 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3082186 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2cc10640305d3fe8cbe50786f1dd8d SystemUUID:ec2cc106-4030-5d3f-e8cb-e50786f1dd8d BootID:003a940a-c36e-4d78-8d9b-1d9b85602e6d Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:81:03:65:d9:77 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:81:03:65:d9:77 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5e:07:72:a3:fe:91 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 15:35:23.365742 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.365732 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 15:35:23.365874 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.365862 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 15:35:23.367435 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.367208 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 15:35:23.367838 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.367437 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-158.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 15:35:23.367906 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.367852 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 15:35:23.367906 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.367867 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 15:35:23.367906 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.367886 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 15:35:23.368875 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.368808 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 15:35:23.369921 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.369906 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 15:35:23.370237 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.370226 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 15:35:23.371747 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.371727 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 15:35:23.373277 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.373265 2569 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 15:35:23.373355 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.373281 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 15:35:23.373355 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.373293 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 15:35:23.373355 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.373302 2569 kubelet.go:397] "Adding apiserver pod source"
Apr 21 15:35:23.373355 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.373311 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 15:35:23.374467 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.374454 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 15:35:23.374532 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.374481 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 15:35:23.392793 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.392770 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 15:35:23.394588 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.394574 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 21 15:35:23.395918 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.395905 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 21 15:35:23.395963 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.395925 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 21 15:35:23.395963 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.395932 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 21 15:35:23.395963 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.395940 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 21 15:35:23.395963 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.395948 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 21 15:35:23.395963 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.395959 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 21 15:35:23.396150 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.395966 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 21 15:35:23.396150 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.395974 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 21 15:35:23.396150 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.395985 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 21 15:35:23.396150 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.395991 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 21 15:35:23.396150 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.396000 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 21 15:35:23.396150 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.396013 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 21 15:35:23.397048 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.397037 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 21 15:35:23.397048 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.397049 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 21 15:35:23.398428 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.398393 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 21 15:35:23.398475 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.398400 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-158.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 21 15:35:23.400801 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.400789 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 21 15:35:23.400854 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.400826 2569 server.go:1295] "Started kubelet"
Apr 21 15:35:23.400919 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.400896 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 21 15:35:23.401013 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.400962 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 21 15:35:23.401056 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.401045 2569 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 21 15:35:23.401896 ip-10-0-133-158 systemd[1]: Started Kubernetes Kubelet.
Apr 21 15:35:23.403117 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.402968 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 15:35:23.404651 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.404635 2569 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 15:35:23.406221 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.406202 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-smng5"
Apr 21 15:35:23.407755 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.407734 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 15:35:23.408283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.408264 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 15:35:23.408422 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.408400 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-158.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 21 15:35:23.408933 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.408886 2569 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 15:35:23.408933 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.408933 2569 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 15:35:23.409083 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.409022 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 15:35:23.409083 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.408023 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-158.ec2.internal.18a869339d4a4ef5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-158.ec2.internal,UID:ip-10-0-133-158.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-158.ec2.internal,},FirstTimestamp:2026-04-21 15:35:23.400802037 +0000 UTC m=+0.503253648,LastTimestamp:2026-04-21 15:35:23.400802037 +0000 UTC m=+0.503253648,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-158.ec2.internal,}"
Apr 21 15:35:23.409083 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.409082 2569 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 15:35:23.409261 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.409089 2569 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 15:35:23.409346 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.409328 2569 factory.go:55] Registering systemd factory
Apr 21 15:35:23.409533 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.409354 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-158.ec2.internal\" not found"
Apr 21 15:35:23.409630 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.409610 2569 factory.go:223] Registration of the systemd container factory successfully
Apr 21 15:35:23.410080 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.410066 2569 factory.go:153] Registering CRI-O factory
Apr 21 15:35:23.410221 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.410209 2569 factory.go:223] Registration of the crio container factory successfully
Apr 21 15:35:23.410408 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.410397 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 15:35:23.410501 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.410493 2569 factory.go:103] Registering Raw factory
Apr 21 15:35:23.410578 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.410571 2569 manager.go:1196] Started watching for new ooms in manager
Apr 21 15:35:23.411561 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.411533 2569 manager.go:319] Starting recovery of all containers
Apr 21 15:35:23.413489 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.413460 2569 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-158.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 21 15:35:23.413713 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.413685 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 21 15:35:23.416492 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.416330 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-smng5"
Apr 21 15:35:23.425781 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.425755 2569 manager.go:324] Recovery completed
Apr 21 15:35:23.430000 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.429985 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:23.432405 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.432388 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:23.432469 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.432418 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:23.432469 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.432430 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:23.432854 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.432830 2569 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 15:35:23.432854 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.432853 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 15:35:23.432957 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.432871 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 15:35:23.434925 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.434912 2569 policy_none.go:49] "None policy: Start"
Apr 21 15:35:23.434988 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.434936 2569 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 15:35:23.434988 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.434947 2569 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 15:35:23.467783 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.467764 2569 manager.go:341] "Starting Device Plugin manager"
Apr 21 15:35:23.480303 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.467848 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 15:35:23.480303 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.467867 2569 server.go:85] "Starting device plugin registration server"
Apr 21 15:35:23.480303 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.468145 2569 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 15:35:23.480303 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.468162 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 15:35:23.480303 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.468267 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 15:35:23.480303 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.468358 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 15:35:23.480303 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.468369 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 15:35:23.480303 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.468948 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 15:35:23.480303 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.468981 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-158.ec2.internal\" not found"
Apr 21 15:35:23.540381 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.540299 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 15:35:23.541603 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.541585 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 15:35:23.541675 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.541619 2569 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 15:35:23.541675 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.541645 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 15:35:23.541675 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.541654 2569 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 15:35:23.541763 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.541691 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 15:35:23.545079 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.545056 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:23.569003 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.568980 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:23.570194 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.570179 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:23.570264 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.570209 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:23.570264 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.570220 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:23.570264 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.570259 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.580817 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.580802 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.580874 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.580825 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-158.ec2.internal\": node \"ip-10-0-133-158.ec2.internal\" not found"
Apr 21 15:35:23.609460 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.609435 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-158.ec2.internal\" not found"
Apr 21 15:35:23.642420 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.642395 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-133-158.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal"]
Apr 21 15:35:23.642522 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.642462 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:23.643464 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.643449 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:23.643531 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.643480 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:23.643531 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.643490 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:23.644560 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.644548 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:23.644706 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.644691 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.644746 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.644725 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:23.645289 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.645274 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:23.645376 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.645299 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:23.645376 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.645311 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:23.645376 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.645277 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:23.645376 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.645368 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:23.645521 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.645381 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:23.646336 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.646324 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.646384 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.646346 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:23.646956 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.646941 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:23.647021 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.646968 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:23.647021 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.646985 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:23.666661 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.666646 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-158.ec2.internal\" not found" node="ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.671087 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.671071 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-158.ec2.internal\" not found" node="ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.709808 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.709777 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-158.ec2.internal\" not found"
Apr 21 15:35:23.710951 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.710933 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/346ecef21a63285028e6bf61503fc2e3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal\" (UID: \"346ecef21a63285028e6bf61503fc2e3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.711025 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.710964 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/35017106fd99d25a848bf349caf9d842-config\") pod \"kube-apiserver-proxy-ip-10-0-133-158.ec2.internal\" (UID: \"35017106fd99d25a848bf349caf9d842\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.711025 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.710989 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/346ecef21a63285028e6bf61503fc2e3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal\" (UID: \"346ecef21a63285028e6bf61503fc2e3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.810705 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.810626 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-158.ec2.internal\" not found"
Apr 21 15:35:23.811938 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.811918 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/346ecef21a63285028e6bf61503fc2e3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal\" (UID: \"346ecef21a63285028e6bf61503fc2e3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.812026 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.811953 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/346ecef21a63285028e6bf61503fc2e3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal\" (UID: \"346ecef21a63285028e6bf61503fc2e3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.812026 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.811979 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/35017106fd99d25a848bf349caf9d842-config\") pod \"kube-apiserver-proxy-ip-10-0-133-158.ec2.internal\" (UID: \"35017106fd99d25a848bf349caf9d842\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.812026 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.812010 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/346ecef21a63285028e6bf61503fc2e3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal\" (UID: \"346ecef21a63285028e6bf61503fc2e3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.812145 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.812036 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/346ecef21a63285028e6bf61503fc2e3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal\" (UID: \"346ecef21a63285028e6bf61503fc2e3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.812145 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.812037 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/35017106fd99d25a848bf349caf9d842-config\") pod \"kube-apiserver-proxy-ip-10-0-133-158.ec2.internal\" (UID: \"35017106fd99d25a848bf349caf9d842\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.911227 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:23.911201 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-158.ec2.internal\" not found"
Apr 21 15:35:23.968495 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.968475 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-158.ec2.internal"
Apr 21 15:35:23.973072 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:23.973039 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal"
Apr 21 15:35:24.011940 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:24.011904 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-158.ec2.internal\" not found"
Apr 21 15:35:24.112401 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:24.112320 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-158.ec2.internal\" not found"
Apr 21 15:35:24.212789 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:24.212758 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-158.ec2.internal\" not found"
Apr 21 15:35:24.296058 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.296025 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 15:35:24.296699 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.296199 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items
received" Apr 21 15:35:24.313358 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:24.313324 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-158.ec2.internal\" not found" Apr 21 15:35:24.408093 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.408030 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 15:35:24.414079 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:24.414057 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-158.ec2.internal\" not found" Apr 21 15:35:24.419027 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.418990 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 15:30:23 +0000 UTC" deadline="2027-09-27 12:50:58.312931693 +0000 UTC" Apr 21 15:35:24.419027 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.419020 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12573h15m33.893914221s" Apr 21 15:35:24.420796 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:24.420763 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35017106fd99d25a848bf349caf9d842.slice/crio-62a5114ca316cb41a186ad15df2040b296621721360eabb3f7eee3bf1b46ecd8 WatchSource:0}: Error finding container 62a5114ca316cb41a186ad15df2040b296621721360eabb3f7eee3bf1b46ecd8: Status 404 returned error can't find the container with id 62a5114ca316cb41a186ad15df2040b296621721360eabb3f7eee3bf1b46ecd8 Apr 21 15:35:24.421092 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:24.421073 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod346ecef21a63285028e6bf61503fc2e3.slice/crio-b449707b5ae455d7f99739a82207cb0042c4b5fc89e6ca922cb1727756a586af WatchSource:0}: Error finding container b449707b5ae455d7f99739a82207cb0042c4b5fc89e6ca922cb1727756a586af: Status 404 returned error can't find the container with id b449707b5ae455d7f99739a82207cb0042c4b5fc89e6ca922cb1727756a586af Apr 21 15:35:24.426031 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.426011 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:35:24.426980 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.426964 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 15:35:24.455445 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.455423 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gshp7" Apr 21 15:35:24.464862 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.464838 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gshp7" Apr 21 15:35:24.514926 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:24.514532 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-158.ec2.internal\" not found" Apr 21 15:35:24.537193 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.537170 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:24.544919 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.544875 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-158.ec2.internal" 
event={"ID":"35017106fd99d25a848bf349caf9d842","Type":"ContainerStarted","Data":"62a5114ca316cb41a186ad15df2040b296621721360eabb3f7eee3bf1b46ecd8"} Apr 21 15:35:24.545706 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.545686 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal" event={"ID":"346ecef21a63285028e6bf61503fc2e3","Type":"ContainerStarted","Data":"b449707b5ae455d7f99739a82207cb0042c4b5fc89e6ca922cb1727756a586af"} Apr 21 15:35:24.595354 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.595329 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:24.609472 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.609454 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-158.ec2.internal" Apr 21 15:35:24.623584 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.623566 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 15:35:24.624471 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.624459 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal" Apr 21 15:35:24.635627 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.635611 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 15:35:24.720037 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:24.720015 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:25.374656 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.374622 2569 apiserver.go:52] "Watching 
apiserver" Apr 21 15:35:25.381647 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.381621 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 15:35:25.382018 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.381995 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-ll86t","openshift-ovn-kubernetes/ovnkube-node-xwc9h","kube-system/kube-apiserver-proxy-ip-10-0-133-158.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk","openshift-multus/multus-84m4w","openshift-multus/network-metrics-daemon-8tknf","openshift-network-diagnostics/network-check-target-4vnh5","openshift-network-operator/iptables-alerter-qjvz6","kube-system/konnectivity-agent-zptw6","openshift-cluster-node-tuning-operator/tuned-x785v","openshift-image-registry/node-ca-94bl8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal"] Apr 21 15:35:25.383513 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.383490 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ll86t" Apr 21 15:35:25.384654 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.384634 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.385943 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.385923 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk" Apr 21 15:35:25.386172 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.386151 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 15:35:25.386682 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.386602 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 15:35:25.386682 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.386665 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 15:35:25.386682 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.386672 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-62wlx\"" Apr 21 15:35:25.386873 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.386711 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 15:35:25.386873 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.386798 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-p5zgv\"" Apr 21 15:35:25.387005 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.386990 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 15:35:25.387293 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.387143 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 15:35:25.387293 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.387147 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 15:35:25.387293 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.387229 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.387583 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.387568 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 15:35:25.387628 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.387607 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 15:35:25.387672 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.387627 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 15:35:25.387672 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.387644 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 15:35:25.388571 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.388554 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:25.388662 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:25.388625 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502" Apr 21 15:35:25.389891 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.389871 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:25.389996 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:25.389944 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282" Apr 21 15:35:25.390186 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.390166 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 15:35:25.390266 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.390197 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 15:35:25.390322 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.390293 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mlqqg\"" Apr 21 15:35:25.390411 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.390396 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 15:35:25.390493 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.390477 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9brrf\"" Apr 21 15:35:25.390590 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.390511 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 15:35:25.391196 ip-10-0-133-158 kubenswrapper[2569]: I0421 
15:35:25.391177 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qjvz6" Apr 21 15:35:25.392656 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.392530 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zptw6" Apr 21 15:35:25.393859 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.393833 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:35:25.394034 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.394005 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 15:35:25.394110 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.394092 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 15:35:25.394252 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.394232 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-mdsh5\"" Apr 21 15:35:25.395349 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.395329 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jtnbc\"" Apr 21 15:35:25.395481 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.395463 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 15:35:25.395676 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.395650 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 15:35:25.396830 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.395927 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.397480 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.397461 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-94bl8" Apr 21 15:35:25.398378 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.398359 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:35:25.398479 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.398399 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-csxqg\"" Apr 21 15:35:25.398479 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.398418 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 15:35:25.399740 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.399726 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 15:35:25.399883 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.399740 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 15:35:25.399953 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.399797 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 15:35:25.400260 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.400244 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pp4l5\"" Apr 21 15:35:25.410376 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.410359 2569 desired_state_of_world_populator.go:158] "Finished 
populating initial desired state of world" Apr 21 15:35:25.419685 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.419653 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e625158-0ce1-4766-9988-80be7fb8ed12-cnibin\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t" Apr 21 15:35:25.419794 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.419705 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs\") pod \"network-metrics-daemon-8tknf\" (UID: \"0e013988-1283-4f21-89bb-0200deb14502\") " pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:25.419794 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.419732 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-etc-openvswitch\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.419794 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.419748 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-node-log\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.419794 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.419770 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-ovn-node-metrics-cert\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.419968 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.419796 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-ovnkube-script-lib\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.419968 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.419833 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-sys-fs\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk" Apr 21 15:35:25.419968 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.419874 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9qf9\" (UniqueName: \"kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9\") pod \"network-check-target-4vnh5\" (UID: \"4e63b458-1e99-4bb3-bb96-259b69e04282\") " pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:25.419968 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.419900 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-sysconfig\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.419968 ip-10-0-133-158 
kubenswrapper[2569]: I0421 15:35:25.419921 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-run\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.419968 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.419945 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-cni-bin\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.420278 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.419993 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-multus-cni-dir\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.420278 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420028 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-cni-binary-copy\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.420278 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420054 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rqqs\" (UniqueName: \"kubernetes.io/projected/8e625158-0ce1-4766-9988-80be7fb8ed12-kube-api-access-4rqqs\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " 
pod="openshift-multus/multus-additional-cni-plugins-ll86t" Apr 21 15:35:25.420278 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420079 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-host\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.420278 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420101 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s494h\" (UniqueName: \"kubernetes.io/projected/46ce2483-51db-443d-8cbc-8d669c012502-kube-api-access-s494h\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.420278 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420155 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-kubelet\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.420278 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420180 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-run-systemd\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.420278 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420203 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-cnibin\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.420278 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420223 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0187917a-3b91-4173-9632-8211e2adc77e-serviceca\") pod \"node-ca-94bl8\" (UID: \"0187917a-3b91-4173-9632-8211e2adc77e\") " pod="openshift-image-registry/node-ca-94bl8" Apr 21 15:35:25.420278 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420245 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7sz5\" (UniqueName: \"kubernetes.io/projected/0187917a-3b91-4173-9632-8211e2adc77e-kube-api-access-n7sz5\") pod \"node-ca-94bl8\" (UID: \"0187917a-3b91-4173-9632-8211e2adc77e\") " pod="openshift-image-registry/node-ca-94bl8" Apr 21 15:35:25.420278 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420267 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/46ce2483-51db-443d-8cbc-8d669c012502-etc-tuned\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420290 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e625158-0ce1-4766-9988-80be7fb8ed12-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t" Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420321 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvxv9\" (UniqueName: \"kubernetes.io/projected/53c1515a-d317-44a6-a294-1a76f1166ce9-kube-api-access-kvxv9\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk" Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420345 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-sysctl-d\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420368 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-var-lib-cni-bin\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420405 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-multus-daemon-config\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420428 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6423248e-635c-46da-980c-a175f07c835b-konnectivity-ca\") pod \"konnectivity-agent-zptw6\" (UID: \"6423248e-635c-46da-980c-a175f07c835b\") " 
pod="kube-system/konnectivity-agent-zptw6"
Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420461 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-systemd-units\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420484 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-var-lib-openvswitch\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420509 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-system-cni-dir\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420532 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-var-lib-cni-multus\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420550 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-hostroot\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420564 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-slash\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420587 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e625158-0ce1-4766-9988-80be7fb8ed12-os-release\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t"
Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420610 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-device-dir\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk"
Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420625 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-run-ovn\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.420762 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420639 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8e625158-0ce1-4766-9988-80be7fb8ed12-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420659 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-socket-dir\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420682 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-multus-socket-dir-parent\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420704 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-run-netns\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420726 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e625158-0ce1-4766-9988-80be7fb8ed12-cni-binary-copy\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420749 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-var-lib-kubelet\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420773 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8e625158-0ce1-4766-9988-80be7fb8ed12-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420798 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33966c59-2c6e-40f4-a1b7-138392e36585-host-slash\") pod \"iptables-alerter-qjvz6\" (UID: \"33966c59-2c6e-40f4-a1b7-138392e36585\") " pod="openshift-network-operator/iptables-alerter-qjvz6"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420830 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-sysctl-conf\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420858 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/46ce2483-51db-443d-8cbc-8d669c012502-tmp\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420885 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-run-ovn-kubernetes\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420910 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-ovnkube-config\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420946 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420972 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-etc-selinux\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.420998 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-log-socket\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421038 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-multus-conf-dir\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.421462 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421098 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nwj7\" (UniqueName: \"kubernetes.io/projected/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-kube-api-access-7nwj7\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421150 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-run-openvswitch\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421192 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/33966c59-2c6e-40f4-a1b7-138392e36585-iptables-alerter-script\") pod \"iptables-alerter-qjvz6\" (UID: \"33966c59-2c6e-40f4-a1b7-138392e36585\") " pod="openshift-network-operator/iptables-alerter-qjvz6"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421237 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6423248e-635c-46da-980c-a175f07c835b-agent-certs\") pod \"konnectivity-agent-zptw6\" (UID: \"6423248e-635c-46da-980c-a175f07c835b\") " pod="kube-system/konnectivity-agent-zptw6"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421292 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e625158-0ce1-4766-9988-80be7fb8ed12-system-cni-dir\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421330 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g78hj\" (UniqueName: \"kubernetes.io/projected/0e013988-1283-4f21-89bb-0200deb14502-kube-api-access-g78hj\") pod \"network-metrics-daemon-8tknf\" (UID: \"0e013988-1283-4f21-89bb-0200deb14502\") " pod="openshift-multus/network-metrics-daemon-8tknf"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421383 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-run-netns\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421409 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-os-release\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421446 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-var-lib-kubelet\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421470 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-registration-dir\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421501 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-sys\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421522 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-cni-netd\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421539 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-run-multus-certs\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421570 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-etc-kubernetes\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421602 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvq9j\" (UniqueName: \"kubernetes.io/projected/33966c59-2c6e-40f4-a1b7-138392e36585-kube-api-access-xvq9j\") pod \"iptables-alerter-qjvz6\" (UID: \"33966c59-2c6e-40f4-a1b7-138392e36585\") " pod="openshift-network-operator/iptables-alerter-qjvz6"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421632 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-modprobe-d\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.422283 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421664 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-lib-modules\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.423006 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421714 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.423006 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421745 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-run-k8s-cni-cncf-io\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.423006 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421785 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-kubernetes\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.423006 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421815 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-systemd\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.423006 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421845 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-env-overrides\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.423006 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421875 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66gzv\" (UniqueName: \"kubernetes.io/projected/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-kube-api-access-66gzv\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.423006 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.421910 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0187917a-3b91-4173-9632-8211e2adc77e-host\") pod \"node-ca-94bl8\" (UID: \"0187917a-3b91-4173-9632-8211e2adc77e\") " pod="openshift-image-registry/node-ca-94bl8"
Apr 21 15:35:25.466406 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.466370 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:30:24 +0000 UTC" deadline="2027-12-06 18:24:57.363541541 +0000 UTC"
Apr 21 15:35:25.466406 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.466397 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14258h49m31.897146444s"
Apr 21 15:35:25.522430 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522393 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-run-netns\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.522612 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522439 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-os-release\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.522612 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522484 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-run-netns\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.522612 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522467 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-var-lib-kubelet\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.522612 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522520 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-var-lib-kubelet\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.522612 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522539 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-registration-dir\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk"
Apr 21 15:35:25.522612 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522562 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-sys\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.522612 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522588 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-cni-netd\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.522612 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522593 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-os-release\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.522612 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522614 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-run-multus-certs\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522641 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-sys\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522640 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-etc-kubernetes\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522689 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-registration-dir\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522689 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvq9j\" (UniqueName: \"kubernetes.io/projected/33966c59-2c6e-40f4-a1b7-138392e36585-kube-api-access-xvq9j\") pod \"iptables-alerter-qjvz6\" (UID: \"33966c59-2c6e-40f4-a1b7-138392e36585\") " pod="openshift-network-operator/iptables-alerter-qjvz6"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522728 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-modprobe-d\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522729 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-run-multus-certs\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522765 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-etc-kubernetes\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522785 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-cni-netd\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522791 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-lib-modules\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522847 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522881 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-run-k8s-cni-cncf-io\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522888 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-lib-modules\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522908 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-kubernetes\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522934 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-systemd\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522958 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-env-overrides\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522971 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-modprobe-d\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.523040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.522984 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66gzv\" (UniqueName: \"kubernetes.io/projected/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-kube-api-access-66gzv\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523004 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-run-k8s-cni-cncf-io\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523012 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0187917a-3b91-4173-9632-8211e2adc77e-host\") pod \"node-ca-94bl8\" (UID: \"0187917a-3b91-4173-9632-8211e2adc77e\") " pod="openshift-image-registry/node-ca-94bl8"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523048 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-systemd\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523061 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e625158-0ce1-4766-9988-80be7fb8ed12-cnibin\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523053 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0187917a-3b91-4173-9632-8211e2adc77e-host\") pod \"node-ca-94bl8\" (UID: \"0187917a-3b91-4173-9632-8211e2adc77e\") " pod="openshift-image-registry/node-ca-94bl8"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523062 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523092 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs\") pod \"network-metrics-daemon-8tknf\" (UID: \"0e013988-1283-4f21-89bb-0200deb14502\") " pod="openshift-multus/network-metrics-daemon-8tknf"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523122 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-etc-openvswitch\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523162 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-node-log\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523167 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e625158-0ce1-4766-9988-80be7fb8ed12-cnibin\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523186 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-ovn-node-metrics-cert\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523212 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-ovnkube-script-lib\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523220 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-node-log\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:25.523243 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523262 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-sys-fs\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523290 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-kubernetes\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523263 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-etc-openvswitch\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.523827 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523294 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9qf9\" (UniqueName: \"kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9\") pod \"network-check-target-4vnh5\" (UID: \"4e63b458-1e99-4bb3-bb96-259b69e04282\") " pod="openshift-network-diagnostics/network-check-target-4vnh5"
Apr 21 15:35:25.524665 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523370 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-sys-fs\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk"
Apr 21 15:35:25.524665
ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523383 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-sysconfig\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.524665 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523491 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-run\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.524665 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523525 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-cni-bin\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.524665 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.523533 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-sysconfig\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.524665 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.524283 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 15:35:25.524665 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.524426 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-multus-cni-dir\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.524665 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.524529 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-cni-binary-copy\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.524665 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.524563 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rqqs\" (UniqueName: \"kubernetes.io/projected/8e625158-0ce1-4766-9988-80be7fb8ed12-kube-api-access-4rqqs\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t" Apr 21 15:35:25.524665 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.524634 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-host\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.525142 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.524667 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s494h\" (UniqueName: 
\"kubernetes.io/projected/46ce2483-51db-443d-8cbc-8d669c012502-kube-api-access-s494h\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.525142 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.524760 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-kubelet\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.525142 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.524793 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-run-systemd\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.525142 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.524819 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-cnibin\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.525142 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525070 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-ovnkube-script-lib\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.525142 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525090 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-env-overrides\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.525142 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525090 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-multus-cni-dir\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.525456 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525192 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-run\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.525456 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525197 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-run-systemd\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.525456 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525244 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-kubelet\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.525456 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525274 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-cnibin\") pod 
\"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.525456 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525353 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-host\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.525681 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525529 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0187917a-3b91-4173-9632-8211e2adc77e-serviceca\") pod \"node-ca-94bl8\" (UID: \"0187917a-3b91-4173-9632-8211e2adc77e\") " pod="openshift-image-registry/node-ca-94bl8" Apr 21 15:35:25.525681 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525578 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7sz5\" (UniqueName: \"kubernetes.io/projected/0187917a-3b91-4173-9632-8211e2adc77e-kube-api-access-n7sz5\") pod \"node-ca-94bl8\" (UID: \"0187917a-3b91-4173-9632-8211e2adc77e\") " pod="openshift-image-registry/node-ca-94bl8" Apr 21 15:35:25.525681 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525531 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-cni-bin\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.525681 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525602 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0187917a-3b91-4173-9632-8211e2adc77e-serviceca\") pod \"node-ca-94bl8\" (UID: \"0187917a-3b91-4173-9632-8211e2adc77e\") " 
pod="openshift-image-registry/node-ca-94bl8" Apr 21 15:35:25.525681 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525668 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/46ce2483-51db-443d-8cbc-8d669c012502-etc-tuned\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.525895 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:25.525733 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs podName:0e013988-1283-4f21-89bb-0200deb14502 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:26.025696964 +0000 UTC m=+3.128148563 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs") pod "network-metrics-daemon-8tknf" (UID: "0e013988-1283-4f21-89bb-0200deb14502") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:25.525895 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525782 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e625158-0ce1-4766-9988-80be7fb8ed12-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t" Apr 21 15:35:25.525895 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525831 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvxv9\" (UniqueName: \"kubernetes.io/projected/53c1515a-d317-44a6-a294-1a76f1166ce9-kube-api-access-kvxv9\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk" Apr 21 15:35:25.525895 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525875 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-sysctl-d\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.526084 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525923 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-var-lib-cni-bin\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.526084 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.525960 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-multus-daemon-config\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.526200 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526073 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6423248e-635c-46da-980c-a175f07c835b-konnectivity-ca\") pod \"konnectivity-agent-zptw6\" (UID: \"6423248e-635c-46da-980c-a175f07c835b\") " pod="kube-system/konnectivity-agent-zptw6" Apr 21 15:35:25.526200 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526173 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-systemd-units\") pod \"ovnkube-node-xwc9h\" (UID: 
\"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.526294 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526202 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-var-lib-openvswitch\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.526294 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526231 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-system-cni-dir\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.526294 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526258 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-var-lib-cni-multus\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.526431 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526286 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-hostroot\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.526431 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526350 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-slash\") pod \"ovnkube-node-xwc9h\" (UID: 
\"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.526527 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526469 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e625158-0ce1-4766-9988-80be7fb8ed12-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t" Apr 21 15:35:25.526527 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526479 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-systemd-units\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.526627 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526522 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e625158-0ce1-4766-9988-80be7fb8ed12-os-release\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t" Apr 21 15:35:25.526627 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526566 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-slash\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.526627 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526606 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-device-dir\") pod 
\"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk" Apr 21 15:35:25.526627 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526612 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-sysctl-d\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.526810 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526628 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-var-lib-openvswitch\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.526810 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526654 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-run-ovn\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.526810 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526668 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-device-dir\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk" Apr 21 15:35:25.526810 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526686 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-system-cni-dir\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.526810 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526732 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-var-lib-cni-multus\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.526810 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526743 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e625158-0ce1-4766-9988-80be7fb8ed12-os-release\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t" Apr 21 15:35:25.526810 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526801 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-hostroot\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.527108 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526812 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-run-ovn\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.527108 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526896 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-var-lib-cni-bin\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.527108 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526934 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8e625158-0ce1-4766-9988-80be7fb8ed12-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t" Apr 21 15:35:25.527108 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.526994 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-socket-dir\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk" Apr 21 15:35:25.527108 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527037 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-multus-socket-dir-parent\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.527108 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527073 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-run-netns\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.527108 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527106 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e625158-0ce1-4766-9988-80be7fb8ed12-cni-binary-copy\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t" Apr 21 15:35:25.527437 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527146 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-var-lib-kubelet\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.527437 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527205 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8e625158-0ce1-4766-9988-80be7fb8ed12-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t" Apr 21 15:35:25.527437 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527245 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33966c59-2c6e-40f4-a1b7-138392e36585-host-slash\") pod \"iptables-alerter-qjvz6\" (UID: \"33966c59-2c6e-40f4-a1b7-138392e36585\") " pod="openshift-network-operator/iptables-alerter-qjvz6" Apr 21 15:35:25.527437 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527277 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-sysctl-conf\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.527437 
ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527317 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/46ce2483-51db-443d-8cbc-8d669c012502-tmp\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v" Apr 21 15:35:25.527437 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527344 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-run-ovn-kubernetes\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.527437 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527396 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-ovnkube-config\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:25.527437 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527417 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-multus-daemon-config\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w" Apr 21 15:35:25.527808 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527446 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk" Apr 21 
15:35:25.527808 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527489 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-etc-selinux\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk"
Apr 21 15:35:25.527808 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527523 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-log-socket\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.527808 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527547 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-multus-conf-dir\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.527808 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527595 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nwj7\" (UniqueName: \"kubernetes.io/projected/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-kube-api-access-7nwj7\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.527808 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527647 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-run-openvswitch\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.527808 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527679 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8e625158-0ce1-4766-9988-80be7fb8ed12-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t"
Apr 21 15:35:25.528106 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527710 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/33966c59-2c6e-40f4-a1b7-138392e36585-iptables-alerter-script\") pod \"iptables-alerter-qjvz6\" (UID: \"33966c59-2c6e-40f4-a1b7-138392e36585\") " pod="openshift-network-operator/iptables-alerter-qjvz6"
Apr 21 15:35:25.528106 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527961 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-etc-sysctl-conf\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.528106 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527947 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-etc-selinux\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk"
Apr 21 15:35:25.528106 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527980 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6423248e-635c-46da-980c-a175f07c835b-agent-certs\") pod \"konnectivity-agent-zptw6\" (UID: \"6423248e-635c-46da-980c-a175f07c835b\") " pod="kube-system/konnectivity-agent-zptw6"
Apr 21 15:35:25.528106 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.528020 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e625158-0ce1-4766-9988-80be7fb8ed12-system-cni-dir\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t"
Apr 21 15:35:25.528106 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.528029 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-host-run-ovn-kubernetes\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.528106 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.528082 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g78hj\" (UniqueName: \"kubernetes.io/projected/0e013988-1283-4f21-89bb-0200deb14502-kube-api-access-g78hj\") pod \"network-metrics-daemon-8tknf\" (UID: \"0e013988-1283-4f21-89bb-0200deb14502\") " pod="openshift-multus/network-metrics-daemon-8tknf"
Apr 21 15:35:25.528426 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.528102 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/46ce2483-51db-443d-8cbc-8d669c012502-etc-tuned\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.528671 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527829 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-ovn-node-metrics-cert\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.528922 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.528900 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e625158-0ce1-4766-9988-80be7fb8ed12-cni-binary-copy\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t"
Apr 21 15:35:25.529018 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.528996 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/33966c59-2c6e-40f4-a1b7-138392e36585-iptables-alerter-script\") pod \"iptables-alerter-qjvz6\" (UID: \"33966c59-2c6e-40f4-a1b7-138392e36585\") " pod="openshift-network-operator/iptables-alerter-qjvz6"
Apr 21 15:35:25.529083 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.529009 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-socket-dir\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk"
Apr 21 15:35:25.529158 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.529098 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-log-socket\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.529216 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.529170 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-run-openvswitch\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.529449 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.529427 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-multus-conf-dir\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.529534 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.529514 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33966c59-2c6e-40f4-a1b7-138392e36585-host-slash\") pod \"iptables-alerter-qjvz6\" (UID: \"33966c59-2c6e-40f4-a1b7-138392e36585\") " pod="openshift-network-operator/iptables-alerter-qjvz6"
Apr 21 15:35:25.529592 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.529482 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6423248e-635c-46da-980c-a175f07c835b-konnectivity-ca\") pod \"konnectivity-agent-zptw6\" (UID: \"6423248e-635c-46da-980c-a175f07c835b\") " pod="kube-system/konnectivity-agent-zptw6"
Apr 21 15:35:25.529592 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.527879 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-cni-binary-copy\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.529685 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.529626 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46ce2483-51db-443d-8cbc-8d669c012502-var-lib-kubelet\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.529685 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.529628 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e625158-0ce1-4766-9988-80be7fb8ed12-system-cni-dir\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t"
Apr 21 15:35:25.529685 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.529641 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-multus-socket-dir-parent\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.529685 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.529654 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8e625158-0ce1-4766-9988-80be7fb8ed12-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t"
Apr 21 15:35:25.529867 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.529700 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-host-run-netns\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.530021 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.530004 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53c1515a-d317-44a6-a294-1a76f1166ce9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk"
Apr 21 15:35:25.530108 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.530028 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-ovnkube-config\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.530406 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.530382 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/46ce2483-51db-443d-8cbc-8d669c012502-tmp\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.532582 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:25.532530 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:35:25.532582 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:25.532561 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:35:25.532582 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:25.532574 2569 projected.go:194] Error preparing data for projected volume kube-api-access-c9qf9 for pod openshift-network-diagnostics/network-check-target-4vnh5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:25.532757 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:25.532643 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9 podName:4e63b458-1e99-4bb3-bb96-259b69e04282 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:26.03262593 +0000 UTC m=+3.135077531 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-c9qf9" (UniqueName: "kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9") pod "network-check-target-4vnh5" (UID: "4e63b458-1e99-4bb3-bb96-259b69e04282") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:25.533554 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.533529 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6423248e-635c-46da-980c-a175f07c835b-agent-certs\") pod \"konnectivity-agent-zptw6\" (UID: \"6423248e-635c-46da-980c-a175f07c835b\") " pod="kube-system/konnectivity-agent-zptw6"
Apr 21 15:35:25.534391 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.534372 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvq9j\" (UniqueName: \"kubernetes.io/projected/33966c59-2c6e-40f4-a1b7-138392e36585-kube-api-access-xvq9j\") pod \"iptables-alerter-qjvz6\" (UID: \"33966c59-2c6e-40f4-a1b7-138392e36585\") " pod="openshift-network-operator/iptables-alerter-qjvz6"
Apr 21 15:35:25.535068 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.535045 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66gzv\" (UniqueName: \"kubernetes.io/projected/6d57d7d1-2c2c-4e3f-b32d-771603839fe4-kube-api-access-66gzv\") pod \"ovnkube-node-xwc9h\" (UID: \"6d57d7d1-2c2c-4e3f-b32d-771603839fe4\") " pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.535374 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.535353 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s494h\" (UniqueName: \"kubernetes.io/projected/46ce2483-51db-443d-8cbc-8d669c012502-kube-api-access-s494h\") pod \"tuned-x785v\" (UID: \"46ce2483-51db-443d-8cbc-8d669c012502\") " pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.537582 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.537560 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvxv9\" (UniqueName: \"kubernetes.io/projected/53c1515a-d317-44a6-a294-1a76f1166ce9-kube-api-access-kvxv9\") pod \"aws-ebs-csi-driver-node-8tjgk\" (UID: \"53c1515a-d317-44a6-a294-1a76f1166ce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk"
Apr 21 15:35:25.537676 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.537582 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rqqs\" (UniqueName: \"kubernetes.io/projected/8e625158-0ce1-4766-9988-80be7fb8ed12-kube-api-access-4rqqs\") pod \"multus-additional-cni-plugins-ll86t\" (UID: \"8e625158-0ce1-4766-9988-80be7fb8ed12\") " pod="openshift-multus/multus-additional-cni-plugins-ll86t"
Apr 21 15:35:25.537752 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.537735 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7sz5\" (UniqueName: \"kubernetes.io/projected/0187917a-3b91-4173-9632-8211e2adc77e-kube-api-access-n7sz5\") pod \"node-ca-94bl8\" (UID: \"0187917a-3b91-4173-9632-8211e2adc77e\") " pod="openshift-image-registry/node-ca-94bl8"
Apr 21 15:35:25.543742 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.543721 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nwj7\" (UniqueName: \"kubernetes.io/projected/c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2-kube-api-access-7nwj7\") pod \"multus-84m4w\" (UID: \"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2\") " pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.553555 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.553533 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g78hj\" (UniqueName: \"kubernetes.io/projected/0e013988-1283-4f21-89bb-0200deb14502-kube-api-access-g78hj\") pod \"network-metrics-daemon-8tknf\" (UID: \"0e013988-1283-4f21-89bb-0200deb14502\") " pod="openshift-multus/network-metrics-daemon-8tknf"
Apr 21 15:35:25.696473 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.696434 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ll86t"
Apr 21 15:35:25.704300 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.704277 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h"
Apr 21 15:35:25.712066 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.712044 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk"
Apr 21 15:35:25.716815 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.716785 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-84m4w"
Apr 21 15:35:25.725390 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.725368 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qjvz6"
Apr 21 15:35:25.732117 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.732096 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zptw6"
Apr 21 15:35:25.737614 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.737593 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x785v"
Apr 21 15:35:25.743142 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.743100 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-94bl8"
Apr 21 15:35:25.748257 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:25.748238 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:26.032393 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:26.032310 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs\") pod \"network-metrics-daemon-8tknf\" (UID: \"0e013988-1283-4f21-89bb-0200deb14502\") " pod="openshift-multus/network-metrics-daemon-8tknf"
Apr 21 15:35:26.032553 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:26.032453 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:26.032553 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:26.032543 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs podName:0e013988-1283-4f21-89bb-0200deb14502 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:27.032513877 +0000 UTC m=+4.134965489 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs") pod "network-metrics-daemon-8tknf" (UID: "0e013988-1283-4f21-89bb-0200deb14502") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:26.071880 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:26.071848 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6423248e_635c_46da_980c_a175f07c835b.slice/crio-33ff2ab124c6cb9a0367d75f19317ff643c6a3c2d74b6e91eac883b7b69b6bb1 WatchSource:0}: Error finding container 33ff2ab124c6cb9a0367d75f19317ff643c6a3c2d74b6e91eac883b7b69b6bb1: Status 404 returned error can't find the container with id 33ff2ab124c6cb9a0367d75f19317ff643c6a3c2d74b6e91eac883b7b69b6bb1
Apr 21 15:35:26.072604 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:26.072509 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0187917a_3b91_4173_9632_8211e2adc77e.slice/crio-d266b932230b8e0007c5275b0994cc08e0ccfbc4a6d01ee1f918f560b7460182 WatchSource:0}: Error finding container d266b932230b8e0007c5275b0994cc08e0ccfbc4a6d01ee1f918f560b7460182: Status 404 returned error can't find the container with id d266b932230b8e0007c5275b0994cc08e0ccfbc4a6d01ee1f918f560b7460182
Apr 21 15:35:26.073783 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:26.073653 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e625158_0ce1_4766_9988_80be7fb8ed12.slice/crio-f0513952b5d148e89c153c6b091259e508cf2390eb5a9851da759ae2f1c429ea WatchSource:0}: Error finding container f0513952b5d148e89c153c6b091259e508cf2390eb5a9851da759ae2f1c429ea: Status 404 returned error can't find the container with id f0513952b5d148e89c153c6b091259e508cf2390eb5a9851da759ae2f1c429ea
Apr 21 15:35:26.076706 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:26.076679 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33966c59_2c6e_40f4_a1b7_138392e36585.slice/crio-cbfef7acecd591d809380379c8f7229c432592b8e62af224a91696b22ce73866 WatchSource:0}: Error finding container cbfef7acecd591d809380379c8f7229c432592b8e62af224a91696b22ce73866: Status 404 returned error can't find the container with id cbfef7acecd591d809380379c8f7229c432592b8e62af224a91696b22ce73866
Apr 21 15:35:26.078755 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:26.078732 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46ce2483_51db_443d_8cbc_8d669c012502.slice/crio-c812715b9713a54fa2ffc9d3ab8d6d4f67d746309f9013ba3076ec04263725fa WatchSource:0}: Error finding container c812715b9713a54fa2ffc9d3ab8d6d4f67d746309f9013ba3076ec04263725fa: Status 404 returned error can't find the container with id c812715b9713a54fa2ffc9d3ab8d6d4f67d746309f9013ba3076ec04263725fa
Apr 21 15:35:26.080021 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:26.079998 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc11e8a8b_9ef5_4ed8_a05e_9e1d9ad466d2.slice/crio-6dc296da9162be89e688df459bd586d39a7781f5c300491cb6216b4a9bb73489 WatchSource:0}: Error finding container 6dc296da9162be89e688df459bd586d39a7781f5c300491cb6216b4a9bb73489: Status 404 returned error can't find the container with id 6dc296da9162be89e688df459bd586d39a7781f5c300491cb6216b4a9bb73489
Apr 21 15:35:26.081423 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:26.081388 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53c1515a_d317_44a6_a294_1a76f1166ce9.slice/crio-91b0d71bf23c27c5661d4887f0369dc6b3dbc5d8ceb3617460a645b69d46f236 WatchSource:0}: Error finding container 91b0d71bf23c27c5661d4887f0369dc6b3dbc5d8ceb3617460a645b69d46f236: Status 404 returned error can't find the container with id 91b0d71bf23c27c5661d4887f0369dc6b3dbc5d8ceb3617460a645b69d46f236
Apr 21 15:35:26.084064 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:26.084019 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d57d7d1_2c2c_4e3f_b32d_771603839fe4.slice/crio-b7f65f9c75e6eff46643aabc61e72afc2133a28a783aade790d8c7b085400c2a WatchSource:0}: Error finding container b7f65f9c75e6eff46643aabc61e72afc2133a28a783aade790d8c7b085400c2a: Status 404 returned error can't find the container with id b7f65f9c75e6eff46643aabc61e72afc2133a28a783aade790d8c7b085400c2a
Apr 21 15:35:26.133355 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:26.133328 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9qf9\" (UniqueName: \"kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9\") pod \"network-check-target-4vnh5\" (UID: \"4e63b458-1e99-4bb3-bb96-259b69e04282\") " pod="openshift-network-diagnostics/network-check-target-4vnh5"
Apr 21 15:35:26.133483 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:26.133464 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:35:26.133483 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:26.133479 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:35:26.133571 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:26.133488 2569 projected.go:194] Error preparing data for projected volume kube-api-access-c9qf9 for pod openshift-network-diagnostics/network-check-target-4vnh5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:26.133571 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:26.133545 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9 podName:4e63b458-1e99-4bb3-bb96-259b69e04282 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:27.133527491 +0000 UTC m=+4.235979108 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-c9qf9" (UniqueName: "kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9") pod "network-check-target-4vnh5" (UID: "4e63b458-1e99-4bb3-bb96-259b69e04282") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:26.467333 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:26.467287 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:30:24 +0000 UTC" deadline="2027-10-13 01:30:00.331592404 +0000 UTC"
Apr 21 15:35:26.467710 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:26.467329 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12945h54m33.864267271s"
Apr 21 15:35:26.557352 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:26.557312 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-94bl8" event={"ID":"0187917a-3b91-4173-9632-8211e2adc77e","Type":"ContainerStarted","Data":"d266b932230b8e0007c5275b0994cc08e0ccfbc4a6d01ee1f918f560b7460182"}
Apr 21 15:35:26.561048 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:26.560692 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll86t" event={"ID":"8e625158-0ce1-4766-9988-80be7fb8ed12","Type":"ContainerStarted","Data":"f0513952b5d148e89c153c6b091259e508cf2390eb5a9851da759ae2f1c429ea"}
Apr 21 15:35:26.565055 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:26.564652 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-158.ec2.internal" event={"ID":"35017106fd99d25a848bf349caf9d842","Type":"ContainerStarted","Data":"12bd7e7421a8315ec9bd452c6d9c8de7fa59ef04a46b4c2903a398019369805b"}
Apr 21 15:35:26.567643 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:26.567541 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-84m4w" event={"ID":"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2","Type":"ContainerStarted","Data":"6dc296da9162be89e688df459bd586d39a7781f5c300491cb6216b4a9bb73489"}
Apr 21 15:35:26.569687 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:26.569574 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" event={"ID":"6d57d7d1-2c2c-4e3f-b32d-771603839fe4","Type":"ContainerStarted","Data":"b7f65f9c75e6eff46643aabc61e72afc2133a28a783aade790d8c7b085400c2a"}
Apr 21 15:35:26.571972 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:26.571876 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk" event={"ID":"53c1515a-d317-44a6-a294-1a76f1166ce9","Type":"ContainerStarted","Data":"91b0d71bf23c27c5661d4887f0369dc6b3dbc5d8ceb3617460a645b69d46f236"}
Apr 21 15:35:26.575791 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:26.575763 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qjvz6" event={"ID":"33966c59-2c6e-40f4-a1b7-138392e36585","Type":"ContainerStarted","Data":"cbfef7acecd591d809380379c8f7229c432592b8e62af224a91696b22ce73866"}
Apr 21 15:35:26.577070 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:26.577045 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zptw6" event={"ID":"6423248e-635c-46da-980c-a175f07c835b","Type":"ContainerStarted","Data":"33ff2ab124c6cb9a0367d75f19317ff643c6a3c2d74b6e91eac883b7b69b6bb1"}
Apr 21 15:35:26.578516 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:26.578486 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x785v" event={"ID":"46ce2483-51db-443d-8cbc-8d669c012502","Type":"ContainerStarted","Data":"c812715b9713a54fa2ffc9d3ab8d6d4f67d746309f9013ba3076ec04263725fa"}
Apr 21 15:35:27.040664 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:27.040622 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs\") pod \"network-metrics-daemon-8tknf\" (UID: \"0e013988-1283-4f21-89bb-0200deb14502\") " pod="openshift-multus/network-metrics-daemon-8tknf"
Apr 21 15:35:27.040856 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:27.040773 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:27.040856 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:27.040846 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs podName:0e013988-1283-4f21-89bb-0200deb14502 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:29.040820656 +0000 UTC m=+6.143272258 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs") pod "network-metrics-daemon-8tknf" (UID: "0e013988-1283-4f21-89bb-0200deb14502") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:27.141618 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:27.141537 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9qf9\" (UniqueName: \"kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9\") pod \"network-check-target-4vnh5\" (UID: \"4e63b458-1e99-4bb3-bb96-259b69e04282\") " pod="openshift-network-diagnostics/network-check-target-4vnh5"
Apr 21 15:35:27.141774 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:27.141695 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:35:27.141774 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:27.141715 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:35:27.141774 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:27.141727 2569 projected.go:194] Error preparing data for projected volume kube-api-access-c9qf9 for pod openshift-network-diagnostics/network-check-target-4vnh5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:27.141923 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:27.141796 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9 podName:4e63b458-1e99-4bb3-bb96-259b69e04282 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:29.141777907 +0000 UTC m=+6.244229526 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-c9qf9" (UniqueName: "kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9") pod "network-check-target-4vnh5" (UID: "4e63b458-1e99-4bb3-bb96-259b69e04282") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:27.542811 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:27.542775 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf"
Apr 21 15:35:27.543291 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:27.542934 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502"
Apr 21 15:35:27.544055 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:27.544025 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5"
Apr 21 15:35:27.549687 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:27.549599 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282"
Apr 21 15:35:27.612352 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:27.612310 2569 generic.go:358] "Generic (PLEG): container finished" podID="346ecef21a63285028e6bf61503fc2e3" containerID="b23ded6fd6ab1546e49c2bdb2f231d37a917ea6a8ff23210acbbec6ec156c961" exitCode=0
Apr 21 15:35:27.612864 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:27.612823 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal" event={"ID":"346ecef21a63285028e6bf61503fc2e3","Type":"ContainerDied","Data":"b23ded6fd6ab1546e49c2bdb2f231d37a917ea6a8ff23210acbbec6ec156c961"}
Apr 21 15:35:27.628990 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:27.628938 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-158.ec2.internal" podStartSLOduration=3.628918975 podStartE2EDuration="3.628918975s" podCreationTimestamp="2026-04-21 15:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:35:26.581378154 +0000 UTC m=+3.683829800" watchObservedRunningTime="2026-04-21 15:35:27.628918975 +0000 UTC m=+4.731370596"
Apr 21 15:35:28.621239 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:28.620531 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal" event={"ID":"346ecef21a63285028e6bf61503fc2e3","Type":"ContainerStarted","Data":"7efc44bc2a42d10b5984995fc72413e2431a457503f8c3cfc1e4df35b4eea2d9"}
Apr 21 15:35:29.058476 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:29.057867 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName:
\"kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs\") pod \"network-metrics-daemon-8tknf\" (UID: \"0e013988-1283-4f21-89bb-0200deb14502\") " pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:29.058476 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:29.058040 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:29.058476 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:29.058117 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs podName:0e013988-1283-4f21-89bb-0200deb14502 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:33.058096672 +0000 UTC m=+10.160548294 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs") pod "network-metrics-daemon-8tknf" (UID: "0e013988-1283-4f21-89bb-0200deb14502") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:29.159179 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:29.159122 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9qf9\" (UniqueName: \"kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9\") pod \"network-check-target-4vnh5\" (UID: \"4e63b458-1e99-4bb3-bb96-259b69e04282\") " pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:29.159355 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:29.159325 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:29.159355 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:29.159344 2569 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:29.159476 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:29.159357 2569 projected.go:194] Error preparing data for projected volume kube-api-access-c9qf9 for pod openshift-network-diagnostics/network-check-target-4vnh5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:29.159476 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:29.159427 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9 podName:4e63b458-1e99-4bb3-bb96-259b69e04282 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:33.159407558 +0000 UTC m=+10.261859163 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-c9qf9" (UniqueName: "kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9") pod "network-check-target-4vnh5" (UID: "4e63b458-1e99-4bb3-bb96-259b69e04282") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:29.543249 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:29.542723 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:29.543249 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:29.542873 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502" Apr 21 15:35:29.543249 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:29.543202 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:29.543547 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:29.543312 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282" Apr 21 15:35:31.543819 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:31.543268 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:31.543819 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:31.543446 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502" Apr 21 15:35:31.543819 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:31.543476 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:31.543819 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:31.543577 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282" Apr 21 15:35:33.091671 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:33.091576 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs\") pod \"network-metrics-daemon-8tknf\" (UID: \"0e013988-1283-4f21-89bb-0200deb14502\") " pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:33.092157 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:33.091742 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:33.092157 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:33.091825 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs podName:0e013988-1283-4f21-89bb-0200deb14502 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:41.091805078 +0000 UTC m=+18.194256694 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs") pod "network-metrics-daemon-8tknf" (UID: "0e013988-1283-4f21-89bb-0200deb14502") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:33.192879 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:33.192248 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9qf9\" (UniqueName: \"kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9\") pod \"network-check-target-4vnh5\" (UID: \"4e63b458-1e99-4bb3-bb96-259b69e04282\") " pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:33.192879 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:33.192423 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:33.192879 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:33.192441 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:33.192879 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:33.192455 2569 projected.go:194] Error preparing data for projected volume kube-api-access-c9qf9 for pod openshift-network-diagnostics/network-check-target-4vnh5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:33.192879 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:33.192523 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9 podName:4e63b458-1e99-4bb3-bb96-259b69e04282 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:41.192502151 +0000 UTC m=+18.294953773 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-c9qf9" (UniqueName: "kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9") pod "network-check-target-4vnh5" (UID: "4e63b458-1e99-4bb3-bb96-259b69e04282") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:33.544094 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:33.543462 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:33.544094 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:33.543576 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282" Apr 21 15:35:33.544094 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:33.543951 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:33.544094 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:33.544054 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502" Apr 21 15:35:35.542499 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:35.542462 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:35.542924 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:35.542507 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:35.542924 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:35.542605 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502" Apr 21 15:35:35.542924 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:35.542734 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282" Apr 21 15:35:37.542228 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:37.542180 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:37.542644 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:37.542180 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:37.542644 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:37.542340 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502" Apr 21 15:35:37.542644 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:37.542395 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282" Apr 21 15:35:39.542866 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:39.542829 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:39.543337 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:39.542884 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:39.543337 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:39.542979 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502" Apr 21 15:35:39.543337 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:39.543098 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282" Apr 21 15:35:40.290060 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:40.290002 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-158.ec2.internal" podStartSLOduration=16.289987302 podStartE2EDuration="16.289987302s" podCreationTimestamp="2026-04-21 15:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:35:28.637433424 +0000 UTC m=+5.739885044" watchObservedRunningTime="2026-04-21 15:35:40.289987302 +0000 UTC m=+17.392438922" Apr 21 15:35:40.290696 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:40.290671 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-f7s8m"] Apr 21 15:35:40.300256 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:40.300224 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-f7s8m" Apr 21 15:35:40.302715 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:40.302690 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8dt82\"" Apr 21 15:35:40.302715 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:40.302694 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 15:35:40.303717 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:40.303683 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 15:35:40.349241 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:40.349209 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5199de06-2148-4da2-80d1-c514e73d93fb-tmp-dir\") pod \"node-resolver-f7s8m\" (UID: \"5199de06-2148-4da2-80d1-c514e73d93fb\") " pod="openshift-dns/node-resolver-f7s8m" Apr 21 15:35:40.349409 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:40.349263 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsxgq\" (UniqueName: \"kubernetes.io/projected/5199de06-2148-4da2-80d1-c514e73d93fb-kube-api-access-vsxgq\") pod \"node-resolver-f7s8m\" (UID: \"5199de06-2148-4da2-80d1-c514e73d93fb\") " pod="openshift-dns/node-resolver-f7s8m" Apr 21 15:35:40.349409 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:40.349352 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5199de06-2148-4da2-80d1-c514e73d93fb-hosts-file\") pod \"node-resolver-f7s8m\" (UID: \"5199de06-2148-4da2-80d1-c514e73d93fb\") " pod="openshift-dns/node-resolver-f7s8m" Apr 21 15:35:40.450693 ip-10-0-133-158 kubenswrapper[2569]: I0421 
15:35:40.450661 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5199de06-2148-4da2-80d1-c514e73d93fb-hosts-file\") pod \"node-resolver-f7s8m\" (UID: \"5199de06-2148-4da2-80d1-c514e73d93fb\") " pod="openshift-dns/node-resolver-f7s8m" Apr 21 15:35:40.450890 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:40.450729 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5199de06-2148-4da2-80d1-c514e73d93fb-tmp-dir\") pod \"node-resolver-f7s8m\" (UID: \"5199de06-2148-4da2-80d1-c514e73d93fb\") " pod="openshift-dns/node-resolver-f7s8m" Apr 21 15:35:40.450890 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:40.450758 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsxgq\" (UniqueName: \"kubernetes.io/projected/5199de06-2148-4da2-80d1-c514e73d93fb-kube-api-access-vsxgq\") pod \"node-resolver-f7s8m\" (UID: \"5199de06-2148-4da2-80d1-c514e73d93fb\") " pod="openshift-dns/node-resolver-f7s8m" Apr 21 15:35:40.450890 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:40.450806 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5199de06-2148-4da2-80d1-c514e73d93fb-hosts-file\") pod \"node-resolver-f7s8m\" (UID: \"5199de06-2148-4da2-80d1-c514e73d93fb\") " pod="openshift-dns/node-resolver-f7s8m" Apr 21 15:35:40.451152 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:40.451108 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5199de06-2148-4da2-80d1-c514e73d93fb-tmp-dir\") pod \"node-resolver-f7s8m\" (UID: \"5199de06-2148-4da2-80d1-c514e73d93fb\") " pod="openshift-dns/node-resolver-f7s8m" Apr 21 15:35:40.462291 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:40.462268 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vsxgq\" (UniqueName: \"kubernetes.io/projected/5199de06-2148-4da2-80d1-c514e73d93fb-kube-api-access-vsxgq\") pod \"node-resolver-f7s8m\" (UID: \"5199de06-2148-4da2-80d1-c514e73d93fb\") " pod="openshift-dns/node-resolver-f7s8m" Apr 21 15:35:40.609985 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:40.609898 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-f7s8m" Apr 21 15:35:41.156845 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:41.156803 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs\") pod \"network-metrics-daemon-8tknf\" (UID: \"0e013988-1283-4f21-89bb-0200deb14502\") " pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:41.157020 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:41.156976 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:41.157089 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:41.157058 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs podName:0e013988-1283-4f21-89bb-0200deb14502 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:57.157038089 +0000 UTC m=+34.259489705 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs") pod "network-metrics-daemon-8tknf" (UID: "0e013988-1283-4f21-89bb-0200deb14502") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:41.258198 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:41.258156 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9qf9\" (UniqueName: \"kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9\") pod \"network-check-target-4vnh5\" (UID: \"4e63b458-1e99-4bb3-bb96-259b69e04282\") " pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:41.258379 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:41.258292 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:41.258379 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:41.258309 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:41.258379 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:41.258322 2569 projected.go:194] Error preparing data for projected volume kube-api-access-c9qf9 for pod openshift-network-diagnostics/network-check-target-4vnh5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:41.258379 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:41.258378 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9 podName:4e63b458-1e99-4bb3-bb96-259b69e04282 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:57.258361691 +0000 UTC m=+34.360813292 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-c9qf9" (UniqueName: "kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9") pod "network-check-target-4vnh5" (UID: "4e63b458-1e99-4bb3-bb96-259b69e04282") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:41.542868 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:41.542833 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:41.543031 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:41.542836 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:41.543031 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:41.542954 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282" Apr 21 15:35:41.543125 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:41.543065 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502" Apr 21 15:35:42.959661 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:42.959442 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5199de06_2148_4da2_80d1_c514e73d93fb.slice/crio-35afe08e3598a8ce205a01c0db9824328f059d645b4865845f1db4404d94c717 WatchSource:0}: Error finding container 35afe08e3598a8ce205a01c0db9824328f059d645b4865845f1db4404d94c717: Status 404 returned error can't find the container with id 35afe08e3598a8ce205a01c0db9824328f059d645b4865845f1db4404d94c717 Apr 21 15:35:43.543509 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.543341 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:43.543601 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.543422 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:43.543601 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:43.543588 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502" Apr 21 15:35:43.543811 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:43.543669 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282" Apr 21 15:35:43.646167 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.646119 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-84m4w" event={"ID":"c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2","Type":"ContainerStarted","Data":"51bfbe92ff6b7a93b0bb032f62644cd07c81253a386583598f57ea56ae4255f1"} Apr 21 15:35:43.648532 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.648500 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" event={"ID":"6d57d7d1-2c2c-4e3f-b32d-771603839fe4","Type":"ContainerStarted","Data":"d3cb6ded09ef238fe9fc977ed05e09b73b88419f0620d406c52311ce51f89504"} Apr 21 15:35:43.648621 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.648540 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" event={"ID":"6d57d7d1-2c2c-4e3f-b32d-771603839fe4","Type":"ContainerStarted","Data":"70ade37c8ee7c043cfed6e5cde40f44843e716c35a139ce192e07db04fd2b615"} Apr 21 15:35:43.649841 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.649806 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk" event={"ID":"53c1515a-d317-44a6-a294-1a76f1166ce9","Type":"ContainerStarted","Data":"095efe5013e549e5d9b267cd56dac7c1c319d0985c26cb5ecd3feed40ddba218"} Apr 21 15:35:43.651470 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.651447 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zptw6" event={"ID":"6423248e-635c-46da-980c-a175f07c835b","Type":"ContainerStarted","Data":"fac1e6777e36b8674d91fcb97179daa52a28a1da1fc44065c44123ad51da0d8f"} Apr 21 15:35:43.652868 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.652845 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f7s8m" 
event={"ID":"5199de06-2148-4da2-80d1-c514e73d93fb","Type":"ContainerStarted","Data":"9f4e5f0e559a92af90979824cdb2e0d8be7680bd34c7a11e81e86fc48265e8a5"} Apr 21 15:35:43.652941 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.652877 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f7s8m" event={"ID":"5199de06-2148-4da2-80d1-c514e73d93fb","Type":"ContainerStarted","Data":"35afe08e3598a8ce205a01c0db9824328f059d645b4865845f1db4404d94c717"} Apr 21 15:35:43.654635 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.654613 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x785v" event={"ID":"46ce2483-51db-443d-8cbc-8d669c012502","Type":"ContainerStarted","Data":"f9adccaad199a64f26725adea90af50ffb7c8e6e0815f6edf22b2aa229bc7e02"} Apr 21 15:35:43.655973 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.655951 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-94bl8" event={"ID":"0187917a-3b91-4173-9632-8211e2adc77e","Type":"ContainerStarted","Data":"4ca463f2707011352041c851f02ffce96ebfb29e2464fe144a1fee4b3f97162a"} Apr 21 15:35:43.658028 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.657999 2569 generic.go:358] "Generic (PLEG): container finished" podID="8e625158-0ce1-4766-9988-80be7fb8ed12" containerID="20d68800d69db04725209a06a8feb844b28e5322f63de8647d42f71704c932fa" exitCode=0 Apr 21 15:35:43.658125 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.658044 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll86t" event={"ID":"8e625158-0ce1-4766-9988-80be7fb8ed12","Type":"ContainerDied","Data":"20d68800d69db04725209a06a8feb844b28e5322f63de8647d42f71704c932fa"} Apr 21 15:35:43.686753 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.686697 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-84m4w" 
podStartSLOduration=3.814667274 podStartE2EDuration="20.686681145s" podCreationTimestamp="2026-04-21 15:35:23 +0000 UTC" firstStartedPulling="2026-04-21 15:35:26.082560983 +0000 UTC m=+3.185012583" lastFinishedPulling="2026-04-21 15:35:42.954574854 +0000 UTC m=+20.057026454" observedRunningTime="2026-04-21 15:35:43.686571262 +0000 UTC m=+20.789022885" watchObservedRunningTime="2026-04-21 15:35:43.686681145 +0000 UTC m=+20.789132768" Apr 21 15:35:43.724030 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.723976 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zptw6" podStartSLOduration=3.883380206 podStartE2EDuration="20.723958086s" podCreationTimestamp="2026-04-21 15:35:23 +0000 UTC" firstStartedPulling="2026-04-21 15:35:26.075466308 +0000 UTC m=+3.177917909" lastFinishedPulling="2026-04-21 15:35:42.916044186 +0000 UTC m=+20.018495789" observedRunningTime="2026-04-21 15:35:43.723650771 +0000 UTC m=+20.826102389" watchObservedRunningTime="2026-04-21 15:35:43.723958086 +0000 UTC m=+20.826409706" Apr 21 15:35:43.753064 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.753023 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-x785v" podStartSLOduration=3.917727877 podStartE2EDuration="20.753008127s" podCreationTimestamp="2026-04-21 15:35:23 +0000 UTC" firstStartedPulling="2026-04-21 15:35:26.080729939 +0000 UTC m=+3.183181538" lastFinishedPulling="2026-04-21 15:35:42.916010184 +0000 UTC m=+20.018461788" observedRunningTime="2026-04-21 15:35:43.752721045 +0000 UTC m=+20.855172667" watchObservedRunningTime="2026-04-21 15:35:43.753008127 +0000 UTC m=+20.855459747" Apr 21 15:35:43.775949 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:43.775906 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-94bl8" podStartSLOduration=3.935360917 podStartE2EDuration="20.775892035s" 
podCreationTimestamp="2026-04-21 15:35:23 +0000 UTC" firstStartedPulling="2026-04-21 15:35:26.075467717 +0000 UTC m=+3.177919317" lastFinishedPulling="2026-04-21 15:35:42.915998837 +0000 UTC m=+20.018450435" observedRunningTime="2026-04-21 15:35:43.775813627 +0000 UTC m=+20.878265248" watchObservedRunningTime="2026-04-21 15:35:43.775892035 +0000 UTC m=+20.878343656" Apr 21 15:35:44.106858 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:44.106641 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 15:35:44.483381 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:44.483278 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T15:35:44.106855215Z","UUID":"9ca6322a-faed-4ee5-b0dd-4f7f2519b41e","Handler":null,"Name":"","Endpoint":""} Apr 21 15:35:44.486383 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:44.486357 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 15:35:44.486383 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:44.486390 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 15:35:44.663881 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:44.663802 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" event={"ID":"6d57d7d1-2c2c-4e3f-b32d-771603839fe4","Type":"ContainerStarted","Data":"210c9cfe701afa3ee4e32224cb45fa1768f3eb27cce3f20aafef45c3e64a9bdb"} Apr 21 15:35:44.663881 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:44.663847 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" 
event={"ID":"6d57d7d1-2c2c-4e3f-b32d-771603839fe4","Type":"ContainerStarted","Data":"85549dbe4c9533d37a0006a83ecc6ec141c6e56bb219039cf3acd3306ca572f5"} Apr 21 15:35:44.663881 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:44.663862 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" event={"ID":"6d57d7d1-2c2c-4e3f-b32d-771603839fe4","Type":"ContainerStarted","Data":"dd6ff5b02171a018853b5a3dce4f58c41ef5c74ef27f624e57e3450a307a5daa"} Apr 21 15:35:44.663881 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:44.663876 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" event={"ID":"6d57d7d1-2c2c-4e3f-b32d-771603839fe4","Type":"ContainerStarted","Data":"b1119d0ab818d968b2b582d95c08d318084b8272dd1fc1db99487917c6f0c6df"} Apr 21 15:35:44.665813 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:44.665750 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk" event={"ID":"53c1515a-d317-44a6-a294-1a76f1166ce9","Type":"ContainerStarted","Data":"302779a6973183d859d1b3553deae504141c2ead1050a7419e4d37aa418f91b3"} Apr 21 15:35:44.668205 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:44.668176 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qjvz6" event={"ID":"33966c59-2c6e-40f4-a1b7-138392e36585","Type":"ContainerStarted","Data":"79f4c8ca5070cd63a4b2f9364b8e28d84c6e1fd7298dcc8b7d59fe9f9ec88cce"} Apr 21 15:35:44.685330 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:44.685272 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qjvz6" podStartSLOduration=4.848042439 podStartE2EDuration="21.685252112s" podCreationTimestamp="2026-04-21 15:35:23 +0000 UTC" firstStartedPulling="2026-04-21 15:35:26.078813566 +0000 UTC m=+3.181265165" lastFinishedPulling="2026-04-21 15:35:42.916023236 +0000 UTC 
m=+20.018474838" observedRunningTime="2026-04-21 15:35:44.684985656 +0000 UTC m=+21.787437278" watchObservedRunningTime="2026-04-21 15:35:44.685252112 +0000 UTC m=+21.787703734" Apr 21 15:35:44.685948 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:44.685883 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-f7s8m" podStartSLOduration=4.685872744 podStartE2EDuration="4.685872744s" podCreationTimestamp="2026-04-21 15:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:35:43.889545352 +0000 UTC m=+20.991996973" watchObservedRunningTime="2026-04-21 15:35:44.685872744 +0000 UTC m=+21.788324366" Apr 21 15:35:45.542555 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:45.542516 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:45.543150 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:45.542519 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:45.543150 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:45.542664 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502" Apr 21 15:35:45.543150 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:45.542726 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282" Apr 21 15:35:45.671521 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:45.671482 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk" event={"ID":"53c1515a-d317-44a6-a294-1a76f1166ce9","Type":"ContainerStarted","Data":"3d2164f534dd6c21599be3d497a1928a295c729a912fdcac27eae1248f05fd0c"} Apr 21 15:35:45.692282 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:45.692235 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8tjgk" podStartSLOduration=3.663062671 podStartE2EDuration="22.69222042s" podCreationTimestamp="2026-04-21 15:35:23 +0000 UTC" firstStartedPulling="2026-04-21 15:35:26.086447558 +0000 UTC m=+3.188899158" lastFinishedPulling="2026-04-21 15:35:45.115605302 +0000 UTC m=+22.218056907" observedRunningTime="2026-04-21 15:35:45.691643385 +0000 UTC m=+22.794095007" watchObservedRunningTime="2026-04-21 15:35:45.69222042 +0000 UTC m=+22.794672041" Apr 21 15:35:47.246045 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:47.246004 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-zptw6" Apr 21 15:35:47.246780 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:47.246704 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-zptw6" 
Apr 21 15:35:47.542603 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:47.542531 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:47.542730 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:47.542541 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:47.542730 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:47.542674 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502" Apr 21 15:35:47.542842 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:47.542722 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282" Apr 21 15:35:47.677662 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:47.677627 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" event={"ID":"6d57d7d1-2c2c-4e3f-b32d-771603839fe4","Type":"ContainerStarted","Data":"3d885c4222a034c8fa206bbdfccc85f7381f26536c56cb8cc55b4f0d09ded179"} Apr 21 15:35:47.679241 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:47.679210 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll86t" event={"ID":"8e625158-0ce1-4766-9988-80be7fb8ed12","Type":"ContainerStarted","Data":"9449cb02f539d15e3ea70296e237f34fc006f175f41803477979e5f12841283a"} Apr 21 15:35:48.682769 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:48.682735 2569 generic.go:358] "Generic (PLEG): container finished" podID="8e625158-0ce1-4766-9988-80be7fb8ed12" containerID="9449cb02f539d15e3ea70296e237f34fc006f175f41803477979e5f12841283a" exitCode=0 Apr 21 15:35:48.683240 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:48.682793 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll86t" event={"ID":"8e625158-0ce1-4766-9988-80be7fb8ed12","Type":"ContainerDied","Data":"9449cb02f539d15e3ea70296e237f34fc006f175f41803477979e5f12841283a"} Apr 21 15:35:49.542657 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:49.542568 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:49.542798 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:49.542568 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:49.542798 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:49.542712 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502" Apr 21 15:35:49.542798 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:49.542735 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282" Apr 21 15:35:49.688375 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:49.687253 2569 generic.go:358] "Generic (PLEG): container finished" podID="8e625158-0ce1-4766-9988-80be7fb8ed12" containerID="709720c3bd17ca4b5fc8c6556f11978f4716d9123a4de60d185c4ec5a6077157" exitCode=0 Apr 21 15:35:49.688375 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:49.687336 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll86t" event={"ID":"8e625158-0ce1-4766-9988-80be7fb8ed12","Type":"ContainerDied","Data":"709720c3bd17ca4b5fc8c6556f11978f4716d9123a4de60d185c4ec5a6077157"} Apr 21 15:35:49.692968 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:49.691033 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" 
event={"ID":"6d57d7d1-2c2c-4e3f-b32d-771603839fe4","Type":"ContainerStarted","Data":"30062e679215a992ecebb1d32f69f3968bfb8b7f52752095fb9d663c61a0f7e1"} Apr 21 15:35:49.692968 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:49.691348 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:49.692968 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:49.691373 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:49.707455 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:49.706995 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:49.739561 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:49.739517 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" podStartSLOduration=9.356334078 podStartE2EDuration="26.739504526s" podCreationTimestamp="2026-04-21 15:35:23 +0000 UTC" firstStartedPulling="2026-04-21 15:35:26.087356662 +0000 UTC m=+3.189808263" lastFinishedPulling="2026-04-21 15:35:43.470527112 +0000 UTC m=+20.572978711" observedRunningTime="2026-04-21 15:35:49.739066247 +0000 UTC m=+26.841517894" watchObservedRunningTime="2026-04-21 15:35:49.739504526 +0000 UTC m=+26.841956147" Apr 21 15:35:50.695533 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:50.695501 2569 generic.go:358] "Generic (PLEG): container finished" podID="8e625158-0ce1-4766-9988-80be7fb8ed12" containerID="ddb8e479d19a2ecd511caa66fef179ae8b4ffc628d1cc9f1634f93bc718f26d1" exitCode=0 Apr 21 15:35:50.695979 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:50.695590 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll86t" 
event={"ID":"8e625158-0ce1-4766-9988-80be7fb8ed12","Type":"ContainerDied","Data":"ddb8e479d19a2ecd511caa66fef179ae8b4ffc628d1cc9f1634f93bc718f26d1"} Apr 21 15:35:50.696089 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:50.696072 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:50.710338 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:50.710303 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:35:51.542328 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:51.542116 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:51.542485 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:51.542189 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:51.542485 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:51.542443 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502" Apr 21 15:35:51.542589 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:51.542488 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282" Apr 21 15:35:51.585712 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:51.585680 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4vnh5"] Apr 21 15:35:51.588226 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:51.588204 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8tknf"] Apr 21 15:35:51.697332 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:51.697302 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:51.697753 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:51.697302 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:51.697753 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:51.697433 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282" Apr 21 15:35:51.697901 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:51.697877 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502" Apr 21 15:35:53.304446 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:53.304415 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zptw6" Apr 21 15:35:53.304940 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:53.304604 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 15:35:53.305223 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:53.305200 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zptw6" Apr 21 15:35:53.543062 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:53.543031 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:35:53.543248 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:53.543171 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tknf" podUID="0e013988-1283-4f21-89bb-0200deb14502" Apr 21 15:35:53.543248 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:53.543215 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 15:35:53.543378 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:53.543326 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4vnh5" podUID="4e63b458-1e99-4bb3-bb96-259b69e04282" Apr 21 15:35:54.698430 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.698399 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-158.ec2.internal" event="NodeReady" Apr 21 15:35:54.698946 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.698541 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 15:35:54.752941 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.752905 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sg5b7"] Apr 21 15:35:54.785632 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.785593 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rhcxk"] Apr 21 15:35:54.785796 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.785772 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sg5b7" Apr 21 15:35:54.788531 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.788500 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 15:35:54.788531 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.788515 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 15:35:54.797057 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.797032 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8774b\"" Apr 21 15:35:54.809213 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.809184 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rhcxk"] Apr 21 15:35:54.809344 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.809220 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns/dns-default-sg5b7"] Apr 21 15:35:54.809395 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.809335 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rhcxk" Apr 21 15:35:54.811930 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.811868 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 15:35:54.812742 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.812717 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 15:35:54.812914 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.812869 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l9cdd\"" Apr 21 15:35:54.812994 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.812935 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 15:35:54.853407 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.853375 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvcqb\" (UniqueName: \"kubernetes.io/projected/60d46684-622f-4c21-be81-9f138a88507d-kube-api-access-gvcqb\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7" Apr 21 15:35:54.853578 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.853414 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60d46684-622f-4c21-be81-9f138a88507d-config-volume\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7" Apr 21 15:35:54.853578 ip-10-0-133-158 
kubenswrapper[2569]: I0421 15:35:54.853445 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/60d46684-622f-4c21-be81-9f138a88507d-tmp-dir\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7" Apr 21 15:35:54.853578 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.853499 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7" Apr 21 15:35:54.954461 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.954423 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvcqb\" (UniqueName: \"kubernetes.io/projected/60d46684-622f-4c21-be81-9f138a88507d-kube-api-access-gvcqb\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7" Apr 21 15:35:54.954608 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.954480 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert\") pod \"ingress-canary-rhcxk\" (UID: \"f1a9389d-bece-4871-8a89-c3af3238f617\") " pod="openshift-ingress-canary/ingress-canary-rhcxk" Apr 21 15:35:54.954608 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.954509 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmrjk\" (UniqueName: \"kubernetes.io/projected/f1a9389d-bece-4871-8a89-c3af3238f617-kube-api-access-qmrjk\") pod \"ingress-canary-rhcxk\" (UID: \"f1a9389d-bece-4871-8a89-c3af3238f617\") " 
pod="openshift-ingress-canary/ingress-canary-rhcxk"
Apr 21 15:35:54.954608 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.954536 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60d46684-622f-4c21-be81-9f138a88507d-config-volume\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7"
Apr 21 15:35:54.954608 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.954594 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/60d46684-622f-4c21-be81-9f138a88507d-tmp-dir\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7"
Apr 21 15:35:54.954778 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.954635 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7"
Apr 21 15:35:54.954778 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:54.954744 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:35:54.954847 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:54.954814 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls podName:60d46684-622f-4c21-be81-9f138a88507d nodeName:}" failed. No retries permitted until 2026-04-21 15:35:55.45479424 +0000 UTC m=+32.557245842 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls") pod "dns-default-sg5b7" (UID: "60d46684-622f-4c21-be81-9f138a88507d") : secret "dns-default-metrics-tls" not found
Apr 21 15:35:54.955007 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.954976 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60d46684-622f-4c21-be81-9f138a88507d-config-volume\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7"
Apr 21 15:35:54.955007 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.954977 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/60d46684-622f-4c21-be81-9f138a88507d-tmp-dir\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7"
Apr 21 15:35:54.972427 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:54.972394 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvcqb\" (UniqueName: \"kubernetes.io/projected/60d46684-622f-4c21-be81-9f138a88507d-kube-api-access-gvcqb\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7"
Apr 21 15:35:55.055640 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.055596 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert\") pod \"ingress-canary-rhcxk\" (UID: \"f1a9389d-bece-4871-8a89-c3af3238f617\") " pod="openshift-ingress-canary/ingress-canary-rhcxk"
Apr 21 15:35:55.055823 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.055647 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmrjk\" (UniqueName: \"kubernetes.io/projected/f1a9389d-bece-4871-8a89-c3af3238f617-kube-api-access-qmrjk\") pod \"ingress-canary-rhcxk\" (UID: \"f1a9389d-bece-4871-8a89-c3af3238f617\") " pod="openshift-ingress-canary/ingress-canary-rhcxk"
Apr 21 15:35:55.055823 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:55.055754 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:35:55.055917 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:55.055826 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert podName:f1a9389d-bece-4871-8a89-c3af3238f617 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:55.555807873 +0000 UTC m=+32.658259484 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert") pod "ingress-canary-rhcxk" (UID: "f1a9389d-bece-4871-8a89-c3af3238f617") : secret "canary-serving-cert" not found
Apr 21 15:35:55.072065 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.072029 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmrjk\" (UniqueName: \"kubernetes.io/projected/f1a9389d-bece-4871-8a89-c3af3238f617-kube-api-access-qmrjk\") pod \"ingress-canary-rhcxk\" (UID: \"f1a9389d-bece-4871-8a89-c3af3238f617\") " pod="openshift-ingress-canary/ingress-canary-rhcxk"
Apr 21 15:35:55.458411 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.458371 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7"
Apr 21 15:35:55.458608 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:55.458548 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:35:55.458669 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:55.458630 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls podName:60d46684-622f-4c21-be81-9f138a88507d nodeName:}" failed. No retries permitted until 2026-04-21 15:35:56.458608509 +0000 UTC m=+33.561060125 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls") pod "dns-default-sg5b7" (UID: "60d46684-622f-4c21-be81-9f138a88507d") : secret "dns-default-metrics-tls" not found
Apr 21 15:35:55.542548 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.542506 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf"
Apr 21 15:35:55.542548 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.542533 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5"
Apr 21 15:35:55.545589 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.545562 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 15:35:55.545693 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.545667 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dzs5m\""
Apr 21 15:35:55.546321 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.546299 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 15:35:55.546577 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.546556 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 15:35:55.548733 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.548710 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-l72g6\""
Apr 21 15:35:55.558919 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.558896 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert\") pod \"ingress-canary-rhcxk\" (UID: \"f1a9389d-bece-4871-8a89-c3af3238f617\") " pod="openshift-ingress-canary/ingress-canary-rhcxk"
Apr 21 15:35:55.559068 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:55.559047 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:35:55.559188 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:55.559117 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert podName:f1a9389d-bece-4871-8a89-c3af3238f617 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:56.559098307 +0000 UTC m=+33.661549929 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert") pod "ingress-canary-rhcxk" (UID: "f1a9389d-bece-4871-8a89-c3af3238f617") : secret "canary-serving-cert" not found
Apr 21 15:35:55.698493 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.698458 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-87c8v"]
Apr 21 15:35:55.728338 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.728261 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-87c8v"]
Apr 21 15:35:55.728481 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.728431 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-87c8v"
Apr 21 15:35:55.731300 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.731273 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 21 15:35:55.732308 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.732289 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 21 15:35:55.732652 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.732633 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 21 15:35:55.732748 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.732678 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 21 15:35:55.732891 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.732871 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-fzxm5\""
Apr 21 15:35:55.861863 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.861823 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2e06b3e4-a71d-4809-ab4f-26e9f064f3ac-signing-key\") pod \"service-ca-865cb79987-87c8v\" (UID: \"2e06b3e4-a71d-4809-ab4f-26e9f064f3ac\") " pod="openshift-service-ca/service-ca-865cb79987-87c8v"
Apr 21 15:35:55.862019 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.861871 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk9sd\" (UniqueName: \"kubernetes.io/projected/2e06b3e4-a71d-4809-ab4f-26e9f064f3ac-kube-api-access-hk9sd\") pod \"service-ca-865cb79987-87c8v\" (UID: \"2e06b3e4-a71d-4809-ab4f-26e9f064f3ac\") " pod="openshift-service-ca/service-ca-865cb79987-87c8v"
Apr 21 15:35:55.862019 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.862011 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2e06b3e4-a71d-4809-ab4f-26e9f064f3ac-signing-cabundle\") pod \"service-ca-865cb79987-87c8v\" (UID: \"2e06b3e4-a71d-4809-ab4f-26e9f064f3ac\") " pod="openshift-service-ca/service-ca-865cb79987-87c8v"
Apr 21 15:35:55.962522 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.962481 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2e06b3e4-a71d-4809-ab4f-26e9f064f3ac-signing-cabundle\") pod \"service-ca-865cb79987-87c8v\" (UID: \"2e06b3e4-a71d-4809-ab4f-26e9f064f3ac\") " pod="openshift-service-ca/service-ca-865cb79987-87c8v"
Apr 21 15:35:55.962522 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.962539 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2e06b3e4-a71d-4809-ab4f-26e9f064f3ac-signing-key\") pod \"service-ca-865cb79987-87c8v\" (UID: \"2e06b3e4-a71d-4809-ab4f-26e9f064f3ac\") " pod="openshift-service-ca/service-ca-865cb79987-87c8v"
Apr 21 15:35:55.962765 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.962673 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hk9sd\" (UniqueName: \"kubernetes.io/projected/2e06b3e4-a71d-4809-ab4f-26e9f064f3ac-kube-api-access-hk9sd\") pod \"service-ca-865cb79987-87c8v\" (UID: \"2e06b3e4-a71d-4809-ab4f-26e9f064f3ac\") " pod="openshift-service-ca/service-ca-865cb79987-87c8v"
Apr 21 15:35:55.963308 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.963275 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2e06b3e4-a71d-4809-ab4f-26e9f064f3ac-signing-cabundle\") pod \"service-ca-865cb79987-87c8v\" (UID: \"2e06b3e4-a71d-4809-ab4f-26e9f064f3ac\") " pod="openshift-service-ca/service-ca-865cb79987-87c8v"
Apr 21 15:35:55.965388 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.965357 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2e06b3e4-a71d-4809-ab4f-26e9f064f3ac-signing-key\") pod \"service-ca-865cb79987-87c8v\" (UID: \"2e06b3e4-a71d-4809-ab4f-26e9f064f3ac\") " pod="openshift-service-ca/service-ca-865cb79987-87c8v"
Apr 21 15:35:55.979504 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:55.979442 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk9sd\" (UniqueName: \"kubernetes.io/projected/2e06b3e4-a71d-4809-ab4f-26e9f064f3ac-kube-api-access-hk9sd\") pod \"service-ca-865cb79987-87c8v\" (UID: \"2e06b3e4-a71d-4809-ab4f-26e9f064f3ac\") " pod="openshift-service-ca/service-ca-865cb79987-87c8v"
Apr 21 15:35:56.039403 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:56.039365 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-87c8v"
Apr 21 15:35:56.463272 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:56.463077 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-87c8v"]
Apr 21 15:35:56.466421 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:56.466399 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7"
Apr 21 15:35:56.466554 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:56.466540 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:35:56.466607 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:56.466598 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls podName:60d46684-622f-4c21-be81-9f138a88507d nodeName:}" failed. No retries permitted until 2026-04-21 15:35:58.466582951 +0000 UTC m=+35.569034563 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls") pod "dns-default-sg5b7" (UID: "60d46684-622f-4c21-be81-9f138a88507d") : secret "dns-default-metrics-tls" not found
Apr 21 15:35:56.567793 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:56.567687 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert\") pod \"ingress-canary-rhcxk\" (UID: \"f1a9389d-bece-4871-8a89-c3af3238f617\") " pod="openshift-ingress-canary/ingress-canary-rhcxk"
Apr 21 15:35:56.567926 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:56.567814 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:35:56.567926 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:56.567870 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert podName:f1a9389d-bece-4871-8a89-c3af3238f617 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:58.567854156 +0000 UTC m=+35.670305760 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert") pod "ingress-canary-rhcxk" (UID: "f1a9389d-bece-4871-8a89-c3af3238f617") : secret "canary-serving-cert" not found
Apr 21 15:35:56.647422 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:56.647384 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e06b3e4_a71d_4809_ab4f_26e9f064f3ac.slice/crio-1410f5305072d0412e214a4ab0dfc501196de65e99c38ac587ce2eb1e49682ab WatchSource:0}: Error finding container 1410f5305072d0412e214a4ab0dfc501196de65e99c38ac587ce2eb1e49682ab: Status 404 returned error can't find the container with id 1410f5305072d0412e214a4ab0dfc501196de65e99c38ac587ce2eb1e49682ab
Apr 21 15:35:56.707434 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:56.707402 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-87c8v" event={"ID":"2e06b3e4-a71d-4809-ab4f-26e9f064f3ac","Type":"ContainerStarted","Data":"1410f5305072d0412e214a4ab0dfc501196de65e99c38ac587ce2eb1e49682ab"}
Apr 21 15:35:57.173144 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:57.173100 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs\") pod \"network-metrics-daemon-8tknf\" (UID: \"0e013988-1283-4f21-89bb-0200deb14502\") " pod="openshift-multus/network-metrics-daemon-8tknf"
Apr 21 15:35:57.173320 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:57.173246 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 15:35:57.173320 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:57.173308 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs podName:0e013988-1283-4f21-89bb-0200deb14502 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:29.173292173 +0000 UTC m=+66.275743792 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs") pod "network-metrics-daemon-8tknf" (UID: "0e013988-1283-4f21-89bb-0200deb14502") : secret "metrics-daemon-secret" not found
Apr 21 15:35:57.273986 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:57.273952 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9qf9\" (UniqueName: \"kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9\") pod \"network-check-target-4vnh5\" (UID: \"4e63b458-1e99-4bb3-bb96-259b69e04282\") " pod="openshift-network-diagnostics/network-check-target-4vnh5"
Apr 21 15:35:57.278166 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:57.278122 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9qf9\" (UniqueName: \"kubernetes.io/projected/4e63b458-1e99-4bb3-bb96-259b69e04282-kube-api-access-c9qf9\") pod \"network-check-target-4vnh5\" (UID: \"4e63b458-1e99-4bb3-bb96-259b69e04282\") " pod="openshift-network-diagnostics/network-check-target-4vnh5"
Apr 21 15:35:57.359763 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:57.359734 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4vnh5"
Apr 21 15:35:57.514911 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:57.514882 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4vnh5"]
Apr 21 15:35:57.518981 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:35:57.518946 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e63b458_1e99_4bb3_bb96_259b69e04282.slice/crio-ee76f8834d1219e92f57a027ce103c89e6c057e79e7bca885e8d993be07d8abf WatchSource:0}: Error finding container ee76f8834d1219e92f57a027ce103c89e6c057e79e7bca885e8d993be07d8abf: Status 404 returned error can't find the container with id ee76f8834d1219e92f57a027ce103c89e6c057e79e7bca885e8d993be07d8abf
Apr 21 15:35:57.712998 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:57.712902 2569 generic.go:358] "Generic (PLEG): container finished" podID="8e625158-0ce1-4766-9988-80be7fb8ed12" containerID="c7b769fc767b9f0497c5b56f8d8b362b4a6392c8b78b7a0e61eed2a8ad401f21" exitCode=0
Apr 21 15:35:57.713534 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:57.712996 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll86t" event={"ID":"8e625158-0ce1-4766-9988-80be7fb8ed12","Type":"ContainerDied","Data":"c7b769fc767b9f0497c5b56f8d8b362b4a6392c8b78b7a0e61eed2a8ad401f21"}
Apr 21 15:35:57.714977 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:57.714943 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4vnh5" event={"ID":"4e63b458-1e99-4bb3-bb96-259b69e04282","Type":"ContainerStarted","Data":"ee76f8834d1219e92f57a027ce103c89e6c057e79e7bca885e8d993be07d8abf"}
Apr 21 15:35:58.483815 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:58.483529 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7"
Apr 21 15:35:58.483815 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:58.483664 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:35:58.483815 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:58.483712 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls podName:60d46684-622f-4c21-be81-9f138a88507d nodeName:}" failed. No retries permitted until 2026-04-21 15:36:02.483699588 +0000 UTC m=+39.586151186 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls") pod "dns-default-sg5b7" (UID: "60d46684-622f-4c21-be81-9f138a88507d") : secret "dns-default-metrics-tls" not found
Apr 21 15:35:58.584905 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:58.584851 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert\") pod \"ingress-canary-rhcxk\" (UID: \"f1a9389d-bece-4871-8a89-c3af3238f617\") " pod="openshift-ingress-canary/ingress-canary-rhcxk"
Apr 21 15:35:58.585050 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:58.585024 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:35:58.585117 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:35:58.585106 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert podName:f1a9389d-bece-4871-8a89-c3af3238f617 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:02.585085061 +0000 UTC m=+39.687536680 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert") pod "ingress-canary-rhcxk" (UID: "f1a9389d-bece-4871-8a89-c3af3238f617") : secret "canary-serving-cert" not found
Apr 21 15:35:58.720543 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:58.720512 2569 generic.go:358] "Generic (PLEG): container finished" podID="8e625158-0ce1-4766-9988-80be7fb8ed12" containerID="59ee258c6e4abc71900a64fa4deaa924801918b12fb01eae9ff2a06f2f01bfdd" exitCode=0
Apr 21 15:35:58.720938 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:58.720556 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll86t" event={"ID":"8e625158-0ce1-4766-9988-80be7fb8ed12","Type":"ContainerDied","Data":"59ee258c6e4abc71900a64fa4deaa924801918b12fb01eae9ff2a06f2f01bfdd"}
Apr 21 15:35:59.727003 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:59.726944 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll86t" event={"ID":"8e625158-0ce1-4766-9988-80be7fb8ed12","Type":"ContainerStarted","Data":"37de2bc47af7e0c62b807073682d98c82c2b334e9794eac9325e9c46dbd73df3"}
Apr 21 15:35:59.728471 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:59.728442 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-87c8v" event={"ID":"2e06b3e4-a71d-4809-ab4f-26e9f064f3ac","Type":"ContainerStarted","Data":"f662668e62970ad8174f12d2e78375e77d50790ea829d8753494147536995999"}
Apr 21 15:35:59.762004 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:59.761935 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ll86t" podStartSLOduration=6.147036055 podStartE2EDuration="36.761916997s" podCreationTimestamp="2026-04-21 15:35:23 +0000 UTC" firstStartedPulling="2026-04-21 15:35:26.075830839 +0000 UTC m=+3.178282441" lastFinishedPulling="2026-04-21 15:35:56.690711781 +0000 UTC m=+33.793163383" observedRunningTime="2026-04-21 15:35:59.760465374 +0000 UTC m=+36.862917032" watchObservedRunningTime="2026-04-21 15:35:59.761916997 +0000 UTC m=+36.864368650"
Apr 21 15:35:59.791117 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:35:59.791050 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-87c8v" podStartSLOduration=2.649116143 podStartE2EDuration="4.791027915s" podCreationTimestamp="2026-04-21 15:35:55 +0000 UTC" firstStartedPulling="2026-04-21 15:35:56.670683002 +0000 UTC m=+33.773134601" lastFinishedPulling="2026-04-21 15:35:58.812594759 +0000 UTC m=+35.915046373" observedRunningTime="2026-04-21 15:35:59.789880848 +0000 UTC m=+36.892332495" watchObservedRunningTime="2026-04-21 15:35:59.791027915 +0000 UTC m=+36.893479538"
Apr 21 15:36:01.733400 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:01.733359 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4vnh5" event={"ID":"4e63b458-1e99-4bb3-bb96-259b69e04282","Type":"ContainerStarted","Data":"d91b922e134abbcf9fe1b81b0c57a0753fd6b050e080266eb02ac0340e5f64bf"}
Apr 21 15:36:01.733981 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:01.733492 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4vnh5"
Apr 21 15:36:01.752395 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:01.751901 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4vnh5" podStartSLOduration=35.213447854 podStartE2EDuration="38.751882325s" podCreationTimestamp="2026-04-21 15:35:23 +0000 UTC" firstStartedPulling="2026-04-21 15:35:57.52136331 +0000 UTC m=+34.623814909" lastFinishedPulling="2026-04-21 15:36:01.05979778 +0000 UTC m=+38.162249380" observedRunningTime="2026-04-21 15:36:01.751656317 +0000 UTC m=+38.854107937" watchObservedRunningTime="2026-04-21 15:36:01.751882325 +0000 UTC m=+38.854333948"
Apr 21 15:36:02.518077 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:02.518033 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7"
Apr 21 15:36:02.518309 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:36:02.518221 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:36:02.518309 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:36:02.518305 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls podName:60d46684-622f-4c21-be81-9f138a88507d nodeName:}" failed. No retries permitted until 2026-04-21 15:36:10.518284404 +0000 UTC m=+47.620736007 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls") pod "dns-default-sg5b7" (UID: "60d46684-622f-4c21-be81-9f138a88507d") : secret "dns-default-metrics-tls" not found
Apr 21 15:36:02.618686 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:02.618645 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert\") pod \"ingress-canary-rhcxk\" (UID: \"f1a9389d-bece-4871-8a89-c3af3238f617\") " pod="openshift-ingress-canary/ingress-canary-rhcxk"
Apr 21 15:36:02.618900 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:36:02.618813 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:36:02.618900 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:36:02.618880 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert podName:f1a9389d-bece-4871-8a89-c3af3238f617 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:10.618865181 +0000 UTC m=+47.721316780 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert") pod "ingress-canary-rhcxk" (UID: "f1a9389d-bece-4871-8a89-c3af3238f617") : secret "canary-serving-cert" not found
Apr 21 15:36:10.573772 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:10.573731 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7"
Apr 21 15:36:10.577160 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:10.577119 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/60d46684-622f-4c21-be81-9f138a88507d-metrics-tls\") pod \"dns-default-sg5b7\" (UID: \"60d46684-622f-4c21-be81-9f138a88507d\") " pod="openshift-dns/dns-default-sg5b7"
Apr 21 15:36:10.674396 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:10.674357 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert\") pod \"ingress-canary-rhcxk\" (UID: \"f1a9389d-bece-4871-8a89-c3af3238f617\") " pod="openshift-ingress-canary/ingress-canary-rhcxk"
Apr 21 15:36:10.676829 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:10.676798 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1a9389d-bece-4871-8a89-c3af3238f617-cert\") pod \"ingress-canary-rhcxk\" (UID: \"f1a9389d-bece-4871-8a89-c3af3238f617\") " pod="openshift-ingress-canary/ingress-canary-rhcxk"
Apr 21 15:36:10.697116 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:10.697077 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sg5b7"
Apr 21 15:36:10.720320 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:10.720290 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rhcxk"
Apr 21 15:36:10.848298 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:10.848223 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sg5b7"]
Apr 21 15:36:10.851906 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:36:10.851877 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60d46684_622f_4c21_be81_9f138a88507d.slice/crio-150f79ec6c9632b98397f71c3d05ab9f779a108cccd52431b7c72123be342713 WatchSource:0}: Error finding container 150f79ec6c9632b98397f71c3d05ab9f779a108cccd52431b7c72123be342713: Status 404 returned error can't find the container with id 150f79ec6c9632b98397f71c3d05ab9f779a108cccd52431b7c72123be342713
Apr 21 15:36:10.878615 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:10.878543 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rhcxk"]
Apr 21 15:36:10.886808 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:36:10.886777 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1a9389d_bece_4871_8a89_c3af3238f617.slice/crio-a387c83a2ddb3f11eeb85c91f00b1f794ebb55fc733eb6931b9a6d5945e84814 WatchSource:0}: Error finding container a387c83a2ddb3f11eeb85c91f00b1f794ebb55fc733eb6931b9a6d5945e84814: Status 404 returned error can't find the container with id a387c83a2ddb3f11eeb85c91f00b1f794ebb55fc733eb6931b9a6d5945e84814
Apr 21 15:36:11.760696 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:11.760661 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rhcxk" event={"ID":"f1a9389d-bece-4871-8a89-c3af3238f617","Type":"ContainerStarted","Data":"a387c83a2ddb3f11eeb85c91f00b1f794ebb55fc733eb6931b9a6d5945e84814"}
Apr 21 15:36:11.762004 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:11.761969 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sg5b7" event={"ID":"60d46684-622f-4c21-be81-9f138a88507d","Type":"ContainerStarted","Data":"150f79ec6c9632b98397f71c3d05ab9f779a108cccd52431b7c72123be342713"}
Apr 21 15:36:13.767802 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:13.767696 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sg5b7" event={"ID":"60d46684-622f-4c21-be81-9f138a88507d","Type":"ContainerStarted","Data":"5e0f08c3b2e30e676b68ef5b2ef5f1ddcc61c874b9cac8057c52bec90a70a6e4"}
Apr 21 15:36:13.767802 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:13.767747 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sg5b7" event={"ID":"60d46684-622f-4c21-be81-9f138a88507d","Type":"ContainerStarted","Data":"8280548fd2b7688f4a60243def86981d4bf57e7a4e9090953b6aa738e03a2082"}
Apr 21 15:36:13.768287 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:13.767826 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-sg5b7"
Apr 21 15:36:13.768981 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:13.768955 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rhcxk" event={"ID":"f1a9389d-bece-4871-8a89-c3af3238f617","Type":"ContainerStarted","Data":"f5865737e218503d0fdda7b67cdf97c147d58c1579f16ae0ff71ba6d688ddbf6"}
Apr 21 15:36:13.792894 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:13.792844 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sg5b7" podStartSLOduration=17.372164631 podStartE2EDuration="19.792831065s" podCreationTimestamp="2026-04-21 15:35:54 +0000 UTC" firstStartedPulling="2026-04-21 15:36:10.853621018 +0000 UTC m=+47.956072617" lastFinishedPulling="2026-04-21 15:36:13.27428743 +0000 UTC m=+50.376739051" observedRunningTime="2026-04-21 15:36:13.792179196 +0000 UTC m=+50.894630816" watchObservedRunningTime="2026-04-21 15:36:13.792831065 +0000 UTC m=+50.895282686"
Apr 21 15:36:13.813227 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:13.813176 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rhcxk" podStartSLOduration=17.422493839 podStartE2EDuration="19.813163543s" podCreationTimestamp="2026-04-21 15:35:54 +0000 UTC" firstStartedPulling="2026-04-21 15:36:10.88871527 +0000 UTC m=+47.991166869" lastFinishedPulling="2026-04-21 15:36:13.279384959 +0000 UTC m=+50.381836573" observedRunningTime="2026-04-21 15:36:13.813027898 +0000 UTC m=+50.915479520" watchObservedRunningTime="2026-04-21 15:36:13.813163543 +0000 UTC m=+50.915615163"
Apr 21 15:36:19.500658 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.500617 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78d6c5c987-9wbsf"]
Apr 21 15:36:19.528375 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.528347 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8"]
Apr 21 15:36:19.528525 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.528492 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78d6c5c987-9wbsf"
Apr 21 15:36:19.530998 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.530952 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 21 15:36:19.532008 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.531980 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 21 15:36:19.532151 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.532050 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 21 15:36:19.532151 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.532049 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 21 15:36:19.532276 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.532177 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-6vdhs\""
Apr 21 15:36:19.548321 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.548294 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78d6c5c987-9wbsf"]
Apr 21 15:36:19.548460 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.548440 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" Apr 21 15:36:19.550837 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.550812 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 21 15:36:19.551551 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.551533 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8"] Apr 21 15:36:19.612168 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.612118 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-hdmgw"] Apr 21 15:36:19.635049 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.635021 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-58d658548d-g2475"] Apr 21 15:36:19.635222 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.635186 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-hdmgw" Apr 21 15:36:19.637568 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.637544 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 15:36:19.637716 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.637645 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-6tv55\"" Apr 21 15:36:19.637716 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.637689 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 15:36:19.639040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.639018 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4221f170-520e-4ce0-b5f7-12915f59c78f-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-78d6c5c987-9wbsf\" (UID: \"4221f170-520e-4ce0-b5f7-12915f59c78f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78d6c5c987-9wbsf" Apr 21 15:36:19.639183 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.639066 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhrqm\" (UniqueName: \"kubernetes.io/projected/4221f170-520e-4ce0-b5f7-12915f59c78f-kube-api-access-nhrqm\") pod \"managed-serviceaccount-addon-agent-78d6c5c987-9wbsf\" (UID: \"4221f170-520e-4ce0-b5f7-12915f59c78f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78d6c5c987-9wbsf" Apr 21 15:36:19.648482 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.648457 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-hdmgw"] Apr 21 15:36:19.648612 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.648566 2569 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.650745 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.650720 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-wt6h2\"" Apr 21 15:36:19.650872 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.650750 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 15:36:19.650872 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.650728 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 21 15:36:19.650993 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.650972 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 21 15:36:19.651068 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.651003 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 21 15:36:19.651534 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.651510 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 21 15:36:19.651578 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.651540 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 15:36:19.655333 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.655309 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-58d658548d-g2475"] Apr 21 15:36:19.722624 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.722585 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-789665c8f5-2khmb"] Apr 21 
15:36:19.739506 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.739470 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v88lw\" (UniqueName: \"kubernetes.io/projected/6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c-kube-api-access-v88lw\") pod \"klusterlet-addon-workmgr-b6cbdfd85-285k8\" (UID: \"6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" Apr 21 15:36:19.739691 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.739527 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhrqm\" (UniqueName: \"kubernetes.io/projected/4221f170-520e-4ce0-b5f7-12915f59c78f-kube-api-access-nhrqm\") pod \"managed-serviceaccount-addon-agent-78d6c5c987-9wbsf\" (UID: \"4221f170-520e-4ce0-b5f7-12915f59c78f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78d6c5c987-9wbsf" Apr 21 15:36:19.739691 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.739559 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g456\" (UniqueName: \"kubernetes.io/projected/f12a02e9-41ef-4e4b-913a-8246dfcb282b-kube-api-access-9g456\") pod \"downloads-6bcc868b7-hdmgw\" (UID: \"f12a02e9-41ef-4e4b-913a-8246dfcb282b\") " pod="openshift-console/downloads-6bcc868b7-hdmgw" Apr 21 15:36:19.739691 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.739581 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c-klusterlet-config\") pod \"klusterlet-addon-workmgr-b6cbdfd85-285k8\" (UID: \"6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" Apr 21 15:36:19.739691 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.739672 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c-tmp\") pod \"klusterlet-addon-workmgr-b6cbdfd85-285k8\" (UID: \"6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" Apr 21 15:36:19.739889 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.739735 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4221f170-520e-4ce0-b5f7-12915f59c78f-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-78d6c5c987-9wbsf\" (UID: \"4221f170-520e-4ce0-b5f7-12915f59c78f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78d6c5c987-9wbsf" Apr 21 15:36:19.742244 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.742219 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4221f170-520e-4ce0-b5f7-12915f59c78f-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-78d6c5c987-9wbsf\" (UID: \"4221f170-520e-4ce0-b5f7-12915f59c78f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78d6c5c987-9wbsf" Apr 21 15:36:19.750293 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.750263 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-789665c8f5-2khmb"] Apr 21 15:36:19.750420 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.750396 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:19.750638 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.750583 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhrqm\" (UniqueName: \"kubernetes.io/projected/4221f170-520e-4ce0-b5f7-12915f59c78f-kube-api-access-nhrqm\") pod \"managed-serviceaccount-addon-agent-78d6c5c987-9wbsf\" (UID: \"4221f170-520e-4ce0-b5f7-12915f59c78f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78d6c5c987-9wbsf" Apr 21 15:36:19.752739 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.752684 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 15:36:19.752739 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.752727 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 15:36:19.753167 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.753148 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 15:36:19.753267 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.753252 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tfsv8\"" Apr 21 15:36:19.759508 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.759489 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 15:36:19.840115 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.840073 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6-stats-auth\") pod \"router-default-58d658548d-g2475\" (UID: 
\"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6\") " pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.840115 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.840120 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9g456\" (UniqueName: \"kubernetes.io/projected/f12a02e9-41ef-4e4b-913a-8246dfcb282b-kube-api-access-9g456\") pod \"downloads-6bcc868b7-hdmgw\" (UID: \"f12a02e9-41ef-4e4b-913a-8246dfcb282b\") " pod="openshift-console/downloads-6bcc868b7-hdmgw" Apr 21 15:36:19.840363 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.840157 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c-klusterlet-config\") pod \"klusterlet-addon-workmgr-b6cbdfd85-285k8\" (UID: \"6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" Apr 21 15:36:19.840363 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.840214 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6-service-ca-bundle\") pod \"router-default-58d658548d-g2475\" (UID: \"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6\") " pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.840363 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.840252 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6-default-certificate\") pod \"router-default-58d658548d-g2475\" (UID: \"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6\") " pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.840363 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.840301 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6-metrics-certs\") pod \"router-default-58d658548d-g2475\" (UID: \"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6\") " pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.840363 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.840330 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c-tmp\") pod \"klusterlet-addon-workmgr-b6cbdfd85-285k8\" (UID: \"6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" Apr 21 15:36:19.840555 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.840388 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jhmv\" (UniqueName: \"kubernetes.io/projected/746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6-kube-api-access-5jhmv\") pod \"router-default-58d658548d-g2475\" (UID: \"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6\") " pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.840555 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.840438 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v88lw\" (UniqueName: \"kubernetes.io/projected/6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c-kube-api-access-v88lw\") pod \"klusterlet-addon-workmgr-b6cbdfd85-285k8\" (UID: \"6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" Apr 21 15:36:19.845419 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.845394 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78d6c5c987-9wbsf" Apr 21 15:36:19.851428 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.851402 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g456\" (UniqueName: \"kubernetes.io/projected/f12a02e9-41ef-4e4b-913a-8246dfcb282b-kube-api-access-9g456\") pod \"downloads-6bcc868b7-hdmgw\" (UID: \"f12a02e9-41ef-4e4b-913a-8246dfcb282b\") " pod="openshift-console/downloads-6bcc868b7-hdmgw" Apr 21 15:36:19.854530 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.854503 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c-tmp\") pod \"klusterlet-addon-workmgr-b6cbdfd85-285k8\" (UID: \"6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" Apr 21 15:36:19.854840 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.854796 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v88lw\" (UniqueName: \"kubernetes.io/projected/6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c-kube-api-access-v88lw\") pod \"klusterlet-addon-workmgr-b6cbdfd85-285k8\" (UID: \"6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" Apr 21 15:36:19.854914 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.854843 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c-klusterlet-config\") pod \"klusterlet-addon-workmgr-b6cbdfd85-285k8\" (UID: \"6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" Apr 21 15:36:19.857311 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.857288 2569 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" Apr 21 15:36:19.940958 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.940909 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94ced3d3-36cb-4c6f-8871-7381aaf033c2-installation-pull-secrets\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:19.941146 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.940978 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6-service-ca-bundle\") pod \"router-default-58d658548d-g2475\" (UID: \"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6\") " pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.941146 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.941015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6-default-certificate\") pod \"router-default-58d658548d-g2475\" (UID: \"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6\") " pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.941146 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.941038 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6-metrics-certs\") pod \"router-default-58d658548d-g2475\" (UID: \"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6\") " pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.941146 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.941064 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94ced3d3-36cb-4c6f-8871-7381aaf033c2-trusted-ca\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:19.941366 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.941156 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/94ced3d3-36cb-4c6f-8871-7381aaf033c2-image-registry-private-configuration\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:19.941366 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.941206 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jhmv\" (UniqueName: \"kubernetes.io/projected/746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6-kube-api-access-5jhmv\") pod \"router-default-58d658548d-g2475\" (UID: \"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6\") " pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.941366 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.941253 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94ced3d3-36cb-4c6f-8871-7381aaf033c2-registry-tls\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:19.941366 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.941278 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/94ced3d3-36cb-4c6f-8871-7381aaf033c2-registry-certificates\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:19.941366 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.941308 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94ced3d3-36cb-4c6f-8871-7381aaf033c2-bound-sa-token\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:19.941366 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.941342 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6-stats-auth\") pod \"router-default-58d658548d-g2475\" (UID: \"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6\") " pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.941662 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.941370 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94ced3d3-36cb-4c6f-8871-7381aaf033c2-ca-trust-extracted\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:19.941662 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.941415 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj4wv\" (UniqueName: \"kubernetes.io/projected/94ced3d3-36cb-4c6f-8871-7381aaf033c2-kube-api-access-rj4wv\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " 
pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:19.941994 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.941968 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6-service-ca-bundle\") pod \"router-default-58d658548d-g2475\" (UID: \"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6\") " pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.944225 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.944156 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6-stats-auth\") pod \"router-default-58d658548d-g2475\" (UID: \"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6\") " pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.944225 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.944183 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6-metrics-certs\") pod \"router-default-58d658548d-g2475\" (UID: \"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6\") " pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.944225 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.944213 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6-default-certificate\") pod \"router-default-58d658548d-g2475\" (UID: \"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6\") " pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.944420 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.944259 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-hdmgw" Apr 21 15:36:19.956990 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.956958 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jhmv\" (UniqueName: \"kubernetes.io/projected/746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6-kube-api-access-5jhmv\") pod \"router-default-58d658548d-g2475\" (UID: \"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6\") " pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:19.957220 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:19.957146 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:20.026291 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.026239 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8"] Apr 21 15:36:20.031268 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:36:20.031209 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dcae2ee_9f9f_4a46_a011_48c9d12f8a3c.slice/crio-e77b0094fc3a9ed957220d263431fb2de6afbe37ea36145f49add6fb8a677c43 WatchSource:0}: Error finding container e77b0094fc3a9ed957220d263431fb2de6afbe37ea36145f49add6fb8a677c43: Status 404 returned error can't find the container with id e77b0094fc3a9ed957220d263431fb2de6afbe37ea36145f49add6fb8a677c43 Apr 21 15:36:20.043360 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.042350 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94ced3d3-36cb-4c6f-8871-7381aaf033c2-trusted-ca\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.043360 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.042402 
2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/94ced3d3-36cb-4c6f-8871-7381aaf033c2-image-registry-private-configuration\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.043360 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.042464 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94ced3d3-36cb-4c6f-8871-7381aaf033c2-registry-tls\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.043360 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.042486 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94ced3d3-36cb-4c6f-8871-7381aaf033c2-registry-certificates\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.043360 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.042516 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94ced3d3-36cb-4c6f-8871-7381aaf033c2-bound-sa-token\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.043360 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.042545 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94ced3d3-36cb-4c6f-8871-7381aaf033c2-ca-trust-extracted\") pod 
\"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.043360 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.042567 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rj4wv\" (UniqueName: \"kubernetes.io/projected/94ced3d3-36cb-4c6f-8871-7381aaf033c2-kube-api-access-rj4wv\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.043360 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.042594 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94ced3d3-36cb-4c6f-8871-7381aaf033c2-installation-pull-secrets\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.043360 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.042672 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78d6c5c987-9wbsf"] Apr 21 15:36:20.044172 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.043770 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94ced3d3-36cb-4c6f-8871-7381aaf033c2-ca-trust-extracted\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.045684 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.045656 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94ced3d3-36cb-4c6f-8871-7381aaf033c2-installation-pull-secrets\") pod 
\"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.046210 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.046166 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/94ced3d3-36cb-4c6f-8871-7381aaf033c2-image-registry-private-configuration\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.047740 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.047721 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94ced3d3-36cb-4c6f-8871-7381aaf033c2-registry-certificates\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.050430 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.050402 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94ced3d3-36cb-4c6f-8871-7381aaf033c2-trusted-ca\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.051629 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.051605 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94ced3d3-36cb-4c6f-8871-7381aaf033c2-registry-tls\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.052229 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:36:20.052202 2569 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4221f170_520e_4ce0_b5f7_12915f59c78f.slice/crio-836fedbb6a4a872b5508f56dc2f18270372da769627ad41645d2f635c113f51a WatchSource:0}: Error finding container 836fedbb6a4a872b5508f56dc2f18270372da769627ad41645d2f635c113f51a: Status 404 returned error can't find the container with id 836fedbb6a4a872b5508f56dc2f18270372da769627ad41645d2f635c113f51a Apr 21 15:36:20.063731 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.063699 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj4wv\" (UniqueName: \"kubernetes.io/projected/94ced3d3-36cb-4c6f-8871-7381aaf033c2-kube-api-access-rj4wv\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.064071 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.064051 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94ced3d3-36cb-4c6f-8871-7381aaf033c2-bound-sa-token\") pod \"image-registry-789665c8f5-2khmb\" (UID: \"94ced3d3-36cb-4c6f-8871-7381aaf033c2\") " pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.087446 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.087412 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-hdmgw"] Apr 21 15:36:20.090269 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:36:20.090239 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf12a02e9_41ef_4e4b_913a_8246dfcb282b.slice/crio-624b583c9530949c6a71f3d79813642c5d65ea6faee4b69b65f8b548a3e7ed37 WatchSource:0}: Error finding container 624b583c9530949c6a71f3d79813642c5d65ea6faee4b69b65f8b548a3e7ed37: Status 404 returned error can't find the container with id 
624b583c9530949c6a71f3d79813642c5d65ea6faee4b69b65f8b548a3e7ed37 Apr 21 15:36:20.111597 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.111575 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-58d658548d-g2475"] Apr 21 15:36:20.113911 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:36:20.113881 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod746f5baf_fbf8_4ff6_8fb9_0a3263bd7dc6.slice/crio-fdf9b7e8cea2593303dffed95594fb3a150c3ed3cddf182ff0e4e06670ad4435 WatchSource:0}: Error finding container fdf9b7e8cea2593303dffed95594fb3a150c3ed3cddf182ff0e4e06670ad4435: Status 404 returned error can't find the container with id fdf9b7e8cea2593303dffed95594fb3a150c3ed3cddf182ff0e4e06670ad4435 Apr 21 15:36:20.364785 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.364698 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.490703 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.490669 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-789665c8f5-2khmb"] Apr 21 15:36:20.493686 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:36:20.493661 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94ced3d3_36cb_4c6f_8871_7381aaf033c2.slice/crio-1f292b8db959e160869fbc12c452acd4f9869abc69a8c77b7f76c280d9cbf820 WatchSource:0}: Error finding container 1f292b8db959e160869fbc12c452acd4f9869abc69a8c77b7f76c280d9cbf820: Status 404 returned error can't find the container with id 1f292b8db959e160869fbc12c452acd4f9869abc69a8c77b7f76c280d9cbf820 Apr 21 15:36:20.787862 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.787785 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-hdmgw" 
event={"ID":"f12a02e9-41ef-4e4b-913a-8246dfcb282b","Type":"ContainerStarted","Data":"624b583c9530949c6a71f3d79813642c5d65ea6faee4b69b65f8b548a3e7ed37"} Apr 21 15:36:20.790556 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.790494 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" event={"ID":"6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c","Type":"ContainerStarted","Data":"e77b0094fc3a9ed957220d263431fb2de6afbe37ea36145f49add6fb8a677c43"} Apr 21 15:36:20.793368 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.792601 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-789665c8f5-2khmb" event={"ID":"94ced3d3-36cb-4c6f-8871-7381aaf033c2","Type":"ContainerStarted","Data":"88681054ea4b4eb1e2f8ddd76ce23df1168011cb585bafe57023e7831cc6caf5"} Apr 21 15:36:20.793368 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.792634 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-789665c8f5-2khmb" event={"ID":"94ced3d3-36cb-4c6f-8871-7381aaf033c2","Type":"ContainerStarted","Data":"1f292b8db959e160869fbc12c452acd4f9869abc69a8c77b7f76c280d9cbf820"} Apr 21 15:36:20.793368 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.793192 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:20.798876 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.798708 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-58d658548d-g2475" event={"ID":"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6","Type":"ContainerStarted","Data":"31fb9f7e5e62e2c2c5a7674ee9914b7b7fa9d5710cea5c55c80500f731ebf49a"} Apr 21 15:36:20.798876 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.798748 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-58d658548d-g2475" 
event={"ID":"746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6","Type":"ContainerStarted","Data":"fdf9b7e8cea2593303dffed95594fb3a150c3ed3cddf182ff0e4e06670ad4435"} Apr 21 15:36:20.801076 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.801035 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78d6c5c987-9wbsf" event={"ID":"4221f170-520e-4ce0-b5f7-12915f59c78f","Type":"ContainerStarted","Data":"836fedbb6a4a872b5508f56dc2f18270372da769627ad41645d2f635c113f51a"} Apr 21 15:36:20.814869 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.813688 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-789665c8f5-2khmb" podStartSLOduration=1.813668694 podStartE2EDuration="1.813668694s" podCreationTimestamp="2026-04-21 15:36:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:36:20.812884645 +0000 UTC m=+57.915336264" watchObservedRunningTime="2026-04-21 15:36:20.813668694 +0000 UTC m=+57.916120323" Apr 21 15:36:20.836949 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.836394 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-58d658548d-g2475" podStartSLOduration=1.836372249 podStartE2EDuration="1.836372249s" podCreationTimestamp="2026-04-21 15:36:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:36:20.834158762 +0000 UTC m=+57.936610380" watchObservedRunningTime="2026-04-21 15:36:20.836372249 +0000 UTC m=+57.938823865" Apr 21 15:36:20.958615 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:20.958580 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:20.961932 ip-10-0-133-158 
kubenswrapper[2569]: I0421 15:36:20.961900 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:21.806346 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:21.806313 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:21.807975 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:21.807799 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-58d658548d-g2475" Apr 21 15:36:22.713890 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:22.713555 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xwc9h" Apr 21 15:36:23.773655 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:23.773622 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sg5b7" Apr 21 15:36:24.821382 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:24.821286 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78d6c5c987-9wbsf" event={"ID":"4221f170-520e-4ce0-b5f7-12915f59c78f","Type":"ContainerStarted","Data":"b57b1e4600b2d4bc9f8d981ea1301bea2ce38d51623ea7d834ad04f517116862"} Apr 21 15:36:24.823124 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:24.823095 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" event={"ID":"6dcae2ee-9f9f-4a46-a011-48c9d12f8a3c","Type":"ContainerStarted","Data":"ff9e82f3a5e56ee09181467832c0b23360718b7f174133ae501e49670b39e210"} Apr 21 15:36:24.823420 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:24.823388 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" 
Apr 21 15:36:24.825914 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:24.825887 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" Apr 21 15:36:24.847725 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:24.847671 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78d6c5c987-9wbsf" podStartSLOduration=1.257796456 podStartE2EDuration="5.847654157s" podCreationTimestamp="2026-04-21 15:36:19 +0000 UTC" firstStartedPulling="2026-04-21 15:36:20.054590785 +0000 UTC m=+57.157042384" lastFinishedPulling="2026-04-21 15:36:24.644448478 +0000 UTC m=+61.746900085" observedRunningTime="2026-04-21 15:36:24.845609872 +0000 UTC m=+61.948061493" watchObservedRunningTime="2026-04-21 15:36:24.847654157 +0000 UTC m=+61.950105774" Apr 21 15:36:24.868095 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:24.868032 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b6cbdfd85-285k8" podStartSLOduration=1.246993986 podStartE2EDuration="5.86801145s" podCreationTimestamp="2026-04-21 15:36:19 +0000 UTC" firstStartedPulling="2026-04-21 15:36:20.033195441 +0000 UTC m=+57.135647041" lastFinishedPulling="2026-04-21 15:36:24.654212899 +0000 UTC m=+61.756664505" observedRunningTime="2026-04-21 15:36:24.866495524 +0000 UTC m=+61.968947146" watchObservedRunningTime="2026-04-21 15:36:24.86801145 +0000 UTC m=+61.970463072" Apr 21 15:36:25.960614 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:25.960583 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sg5b7_60d46684-622f-4c21-be81-9f138a88507d/dns/0.log" Apr 21 15:36:26.133027 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:26.132997 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-sg5b7_60d46684-622f-4c21-be81-9f138a88507d/kube-rbac-proxy/0.log" Apr 21 15:36:26.531951 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:26.531921 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-f7s8m_5199de06-2148-4da2-80d1-c514e73d93fb/dns-node-resolver/0.log" Apr 21 15:36:27.331567 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:27.331534 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-789665c8f5-2khmb_94ced3d3-36cb-4c6f-8871-7381aaf033c2/registry/0.log" Apr 21 15:36:27.533005 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:27.532975 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-94bl8_0187917a-3b91-4173-9632-8211e2adc77e/node-ca/0.log" Apr 21 15:36:28.132307 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:28.132280 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-58d658548d-g2475_746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6/router/0.log" Apr 21 15:36:28.731895 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:28.731871 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rhcxk_f1a9389d-bece-4871-8a89-c3af3238f617/serve-healthcheck-canary/0.log" Apr 21 15:36:29.215103 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:29.215010 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs\") pod \"network-metrics-daemon-8tknf\" (UID: \"0e013988-1283-4f21-89bb-0200deb14502\") " pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:36:29.217754 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:29.217724 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/0e013988-1283-4f21-89bb-0200deb14502-metrics-certs\") pod \"network-metrics-daemon-8tknf\" (UID: \"0e013988-1283-4f21-89bb-0200deb14502\") " pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:36:29.457163 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:29.457120 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dzs5m\"" Apr 21 15:36:29.465191 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:29.465111 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tknf" Apr 21 15:36:29.602169 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:29.602122 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8tknf"] Apr 21 15:36:29.606025 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:36:29.605990 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e013988_1283_4f21_89bb_0200deb14502.slice/crio-854777ebc538f3daadd00693fe42e546c111f119825347ab4d8f44c657349432 WatchSource:0}: Error finding container 854777ebc538f3daadd00693fe42e546c111f119825347ab4d8f44c657349432: Status 404 returned error can't find the container with id 854777ebc538f3daadd00693fe42e546c111f119825347ab4d8f44c657349432 Apr 21 15:36:29.838298 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:29.838201 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8tknf" event={"ID":"0e013988-1283-4f21-89bb-0200deb14502","Type":"ContainerStarted","Data":"854777ebc538f3daadd00693fe42e546c111f119825347ab4d8f44c657349432"} Apr 21 15:36:30.932832 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:30.932785 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn"] Apr 21 15:36:30.953342 ip-10-0-133-158 kubenswrapper[2569]: 
I0421 15:36:30.953296 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pjqcl"] Apr 21 15:36:30.968933 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:30.968889 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn"] Apr 21 15:36:30.969094 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:30.969035 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:30.969183 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:30.969033 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" Apr 21 15:36:30.972663 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:30.972472 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 21 15:36:30.973415 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:30.973394 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 15:36:30.973536 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:30.973522 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 15:36:30.973690 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:30.973665 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 15:36:30.975254 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:30.974771 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-79vlq\"" Apr 21 15:36:30.975254 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:30.974976 2569 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 15:36:30.975401 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:30.975284 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 15:36:30.975401 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:30.975286 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mk69c\"" Apr 21 15:36:30.976044 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:30.975856 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 21 15:36:30.979081 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:30.979058 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 15:36:31.031161 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.031112 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-accelerators-collector-config\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.031323 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.031175 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.031323 ip-10-0-133-158 kubenswrapper[2569]: I0421 
15:36:31.031206 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-sys\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.031323 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.031236 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bc24c10-a6e7-48e8-a178-dbc7f52c7d59-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-sgdpn\" (UID: \"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" Apr 21 15:36:31.031455 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.031336 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-wtmp\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.031455 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.031399 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bc24c10-a6e7-48e8-a178-dbc7f52c7d59-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-sgdpn\" (UID: \"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" Apr 21 15:36:31.031455 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.031444 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-metrics-client-ca\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.031591 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.031485 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9d25\" (UniqueName: \"kubernetes.io/projected/2bc24c10-a6e7-48e8-a178-dbc7f52c7d59-kube-api-access-m9d25\") pod \"openshift-state-metrics-9d44df66c-sgdpn\" (UID: \"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" Apr 21 15:36:31.031591 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.031508 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-root\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.031591 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.031540 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzggf\" (UniqueName: \"kubernetes.io/projected/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-kube-api-access-mzggf\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.031591 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.031566 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-textfile\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.031787 ip-10-0-133-158 
kubenswrapper[2569]: I0421 15:36:31.031596 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-tls\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.031787 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.031627 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2bc24c10-a6e7-48e8-a178-dbc7f52c7d59-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-sgdpn\" (UID: \"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.132691 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bc24c10-a6e7-48e8-a178-dbc7f52c7d59-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-sgdpn\" (UID: \"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.132753 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-metrics-client-ca\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.132787 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9d25\" (UniqueName: 
\"kubernetes.io/projected/2bc24c10-a6e7-48e8-a178-dbc7f52c7d59-kube-api-access-m9d25\") pod \"openshift-state-metrics-9d44df66c-sgdpn\" (UID: \"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.132812 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-root\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.132842 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzggf\" (UniqueName: \"kubernetes.io/projected/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-kube-api-access-mzggf\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.132875 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-textfile\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.132905 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-tls\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.132935 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2bc24c10-a6e7-48e8-a178-dbc7f52c7d59-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-sgdpn\" (UID: \"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.132966 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-accelerators-collector-config\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.132997 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.133026 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-sys\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.133059 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bc24c10-a6e7-48e8-a178-dbc7f52c7d59-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-sgdpn\" 
(UID: \"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.133090 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-wtmp\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.133272 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-wtmp\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:36:31.133383 2569 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 21 15:36:31.139185 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:36:31.133454 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bc24c10-a6e7-48e8-a178-dbc7f52c7d59-openshift-state-metrics-tls podName:2bc24c10-a6e7-48e8-a178-dbc7f52c7d59 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:31.633433099 +0000 UTC m=+68.735884707 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/2bc24c10-a6e7-48e8-a178-dbc7f52c7d59-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-sgdpn" (UID: "2bc24c10-a6e7-48e8-a178-dbc7f52c7d59") : secret "openshift-state-metrics-tls" not found Apr 21 15:36:31.140117 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.134219 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-metrics-client-ca\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.140117 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.134503 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-root\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.140117 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.134950 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-textfile\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.140117 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:36:31.135035 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 15:36:31.140117 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:36:31.135079 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-tls podName:ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:36:31.6350645 +0000 UTC m=+68.737516105 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-tls") pod "node-exporter-pjqcl" (UID: "ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58") : secret "node-exporter-tls" not found Apr 21 15:36:31.140117 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.135469 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-sys\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.140117 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.135985 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-accelerators-collector-config\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.140117 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.136086 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bc24c10-a6e7-48e8-a178-dbc7f52c7d59-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-sgdpn\" (UID: \"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" Apr 21 15:36:31.140117 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.138289 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2bc24c10-a6e7-48e8-a178-dbc7f52c7d59-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-9d44df66c-sgdpn\" (UID: \"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" Apr 21 15:36:31.151354 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.151280 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.155678 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.152295 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzggf\" (UniqueName: \"kubernetes.io/projected/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-kube-api-access-mzggf\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.157548 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.157503 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9d25\" (UniqueName: \"kubernetes.io/projected/2bc24c10-a6e7-48e8-a178-dbc7f52c7d59-kube-api-access-m9d25\") pod \"openshift-state-metrics-9d44df66c-sgdpn\" (UID: \"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" Apr 21 15:36:31.641962 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.641921 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bc24c10-a6e7-48e8-a178-dbc7f52c7d59-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-sgdpn\" (UID: \"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" Apr 21 15:36:31.642174 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.642010 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-tls\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.645395 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.645344 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58-node-exporter-tls\") pod \"node-exporter-pjqcl\" (UID: \"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58\") " pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.645555 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.645531 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bc24c10-a6e7-48e8-a178-dbc7f52c7d59-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-sgdpn\" (UID: \"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" Apr 21 15:36:31.813709 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.813248 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-jscgp"] Apr 21 15:36:31.832188 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.832002 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jscgp"] Apr 21 15:36:31.832188 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.832185 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:31.842690 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.840455 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dcdll\"" Apr 21 15:36:31.842690 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.840708 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 15:36:31.842690 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.840937 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 15:36:31.842690 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.841222 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 15:36:31.842690 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.841411 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 15:36:31.856713 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.856228 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8tknf" event={"ID":"0e013988-1283-4f21-89bb-0200deb14502","Type":"ContainerStarted","Data":"4336ff982a33e75569a1230e08613e6afbabea96e1adf7513013d5bb7f6f03ec"} Apr 21 15:36:31.856713 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.856266 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8tknf" event={"ID":"0e013988-1283-4f21-89bb-0200deb14502","Type":"ContainerStarted","Data":"6772d60e75cced180e72d541d2f59cdc0d95dcf10baeb3f1cc87207598a01c0a"} Apr 21 15:36:31.884382 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.884151 2569 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-8tknf" podStartSLOduration=67.484648089 podStartE2EDuration="1m8.884115317s" podCreationTimestamp="2026-04-21 15:35:23 +0000 UTC" firstStartedPulling="2026-04-21 15:36:29.608272784 +0000 UTC m=+66.710724387" lastFinishedPulling="2026-04-21 15:36:31.007740002 +0000 UTC m=+68.110191615" observedRunningTime="2026-04-21 15:36:31.882675129 +0000 UTC m=+68.985126751" watchObservedRunningTime="2026-04-21 15:36:31.884115317 +0000 UTC m=+68.986566939" Apr 21 15:36:31.885266 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.885243 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pjqcl" Apr 21 15:36:31.892423 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.892358 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" Apr 21 15:36:31.947098 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.947062 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a429572e-f646-4624-aaee-489752ccaffb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jscgp\" (UID: \"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:31.947543 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.947176 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a429572e-f646-4624-aaee-489752ccaffb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jscgp\" (UID: \"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:31.947543 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.947211 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a429572e-f646-4624-aaee-489752ccaffb-crio-socket\") pod \"insights-runtime-extractor-jscgp\" (UID: \"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:31.947543 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.947246 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95t5v\" (UniqueName: \"kubernetes.io/projected/a429572e-f646-4624-aaee-489752ccaffb-kube-api-access-95t5v\") pod \"insights-runtime-extractor-jscgp\" (UID: \"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:31.947543 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:31.947276 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a429572e-f646-4624-aaee-489752ccaffb-data-volume\") pod \"insights-runtime-extractor-jscgp\" (UID: \"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:32.048671 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.048630 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a429572e-f646-4624-aaee-489752ccaffb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jscgp\" (UID: \"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:32.048844 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.048683 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a429572e-f646-4624-aaee-489752ccaffb-crio-socket\") pod \"insights-runtime-extractor-jscgp\" (UID: 
\"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:32.048844 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.048719 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95t5v\" (UniqueName: \"kubernetes.io/projected/a429572e-f646-4624-aaee-489752ccaffb-kube-api-access-95t5v\") pod \"insights-runtime-extractor-jscgp\" (UID: \"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:32.048844 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.048752 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a429572e-f646-4624-aaee-489752ccaffb-data-volume\") pod \"insights-runtime-extractor-jscgp\" (UID: \"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:32.048844 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:36:32.048784 2569 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 15:36:32.048844 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.048788 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a429572e-f646-4624-aaee-489752ccaffb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jscgp\" (UID: \"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:32.049118 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:36:32.048862 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a429572e-f646-4624-aaee-489752ccaffb-insights-runtime-extractor-tls podName:a429572e-f646-4624-aaee-489752ccaffb nodeName:}" failed. 
No retries permitted until 2026-04-21 15:36:32.548830731 +0000 UTC m=+69.651282333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/a429572e-f646-4624-aaee-489752ccaffb-insights-runtime-extractor-tls") pod "insights-runtime-extractor-jscgp" (UID: "a429572e-f646-4624-aaee-489752ccaffb") : secret "insights-runtime-extractor-tls" not found Apr 21 15:36:32.049412 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.049379 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a429572e-f646-4624-aaee-489752ccaffb-crio-socket\") pod \"insights-runtime-extractor-jscgp\" (UID: \"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:32.049412 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.049405 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a429572e-f646-4624-aaee-489752ccaffb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jscgp\" (UID: \"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:32.049637 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.049616 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a429572e-f646-4624-aaee-489752ccaffb-data-volume\") pod \"insights-runtime-extractor-jscgp\" (UID: \"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:32.062316 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.062287 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95t5v\" (UniqueName: \"kubernetes.io/projected/a429572e-f646-4624-aaee-489752ccaffb-kube-api-access-95t5v\") pod \"insights-runtime-extractor-jscgp\" 
(UID: \"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:32.063231 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.063203 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn"] Apr 21 15:36:32.069394 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:36:32.069289 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bc24c10_a6e7_48e8_a178_dbc7f52c7d59.slice/crio-d7214f7fcc85e3dbf8923450b50ea9f630483b71a2d5d1aa065b1e623441bf40 WatchSource:0}: Error finding container d7214f7fcc85e3dbf8923450b50ea9f630483b71a2d5d1aa065b1e623441bf40: Status 404 returned error can't find the container with id d7214f7fcc85e3dbf8923450b50ea9f630483b71a2d5d1aa065b1e623441bf40 Apr 21 15:36:32.553505 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.553455 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a429572e-f646-4624-aaee-489752ccaffb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jscgp\" (UID: \"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:32.556318 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.556287 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a429572e-f646-4624-aaee-489752ccaffb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jscgp\" (UID: \"a429572e-f646-4624-aaee-489752ccaffb\") " pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:32.738922 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.738886 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4vnh5" Apr 21 
15:36:32.765266 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.764901 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jscgp" Apr 21 15:36:32.862561 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.862525 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" event={"ID":"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59","Type":"ContainerStarted","Data":"fe8f2e3da6ba8f2c616420ba59358aab17362ce9f8de1e60d0d5af894ae37407"} Apr 21 15:36:32.862712 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.862569 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" event={"ID":"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59","Type":"ContainerStarted","Data":"c6ce37bb7d4566398ea5cce596f7f8acab0c10bd43525fa95c2a4c353e0b2c8e"} Apr 21 15:36:32.862712 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.862586 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" event={"ID":"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59","Type":"ContainerStarted","Data":"d7214f7fcc85e3dbf8923450b50ea9f630483b71a2d5d1aa065b1e623441bf40"} Apr 21 15:36:32.864106 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.863926 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pjqcl" event={"ID":"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58","Type":"ContainerStarted","Data":"b3f833cd5b56774eb7843cab1abd9ecf764f7ace4b05b01afad2c7243fa8afdb"} Apr 21 15:36:32.864106 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:32.863963 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pjqcl" event={"ID":"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58","Type":"ContainerStarted","Data":"5f6c2652a02342c9c749ceba1c7b4cdb84d5467a428547e100d8769b15112023"} Apr 21 15:36:32.911265 ip-10-0-133-158 
kubenswrapper[2569]: I0421 15:36:32.911205 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jscgp"] Apr 21 15:36:35.309230 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.309189 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-8b68b5dff-cfnl7"] Apr 21 15:36:35.312494 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.312466 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.316113 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.316083 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 15:36:35.316265 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.316119 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 15:36:35.316265 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.316086 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-cm9cnt2hgckhu\"" Apr 21 15:36:35.316265 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.316161 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 21 15:36:35.316265 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.316093 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 15:36:35.316265 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.316085 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-bx8nt\"" Apr 21 15:36:35.321054 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.320945 2569 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/metrics-server-8b68b5dff-cfnl7"] Apr 21 15:36:35.378085 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.378048 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/23a4ea67-b844-43a1-bd74-fb2d6787d688-audit-log\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.378322 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.378105 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/23a4ea67-b844-43a1-bd74-fb2d6787d688-secret-metrics-server-client-certs\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.378322 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.378217 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/23a4ea67-b844-43a1-bd74-fb2d6787d688-secret-metrics-server-tls\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.378322 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.378272 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/23a4ea67-b844-43a1-bd74-fb2d6787d688-metrics-server-audit-profiles\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.378474 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.378331 
2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a4ea67-b844-43a1-bd74-fb2d6787d688-client-ca-bundle\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.378474 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.378379 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23a4ea67-b844-43a1-bd74-fb2d6787d688-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.378474 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.378399 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pfm9\" (UniqueName: \"kubernetes.io/projected/23a4ea67-b844-43a1-bd74-fb2d6787d688-kube-api-access-5pfm9\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.478835 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.478794 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/23a4ea67-b844-43a1-bd74-fb2d6787d688-audit-log\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.479024 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.478846 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/23a4ea67-b844-43a1-bd74-fb2d6787d688-secret-metrics-server-client-certs\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.479024 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.478907 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/23a4ea67-b844-43a1-bd74-fb2d6787d688-secret-metrics-server-tls\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.479024 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.478968 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/23a4ea67-b844-43a1-bd74-fb2d6787d688-metrics-server-audit-profiles\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.479024 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.479015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a4ea67-b844-43a1-bd74-fb2d6787d688-client-ca-bundle\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.479277 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.479063 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23a4ea67-b844-43a1-bd74-fb2d6787d688-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " 
pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.479277 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.479083 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pfm9\" (UniqueName: \"kubernetes.io/projected/23a4ea67-b844-43a1-bd74-fb2d6787d688-kube-api-access-5pfm9\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.479380 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.479334 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/23a4ea67-b844-43a1-bd74-fb2d6787d688-audit-log\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.479904 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.479855 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23a4ea67-b844-43a1-bd74-fb2d6787d688-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.480143 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.480108 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/23a4ea67-b844-43a1-bd74-fb2d6787d688-metrics-server-audit-profiles\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.481932 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.481910 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/23a4ea67-b844-43a1-bd74-fb2d6787d688-secret-metrics-server-tls\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.482048 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.482022 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a4ea67-b844-43a1-bd74-fb2d6787d688-client-ca-bundle\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.482104 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.482082 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/23a4ea67-b844-43a1-bd74-fb2d6787d688-secret-metrics-server-client-certs\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.488640 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.488596 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pfm9\" (UniqueName: \"kubernetes.io/projected/23a4ea67-b844-43a1-bd74-fb2d6787d688-kube-api-access-5pfm9\") pod \"metrics-server-8b68b5dff-cfnl7\" (UID: \"23a4ea67-b844-43a1-bd74-fb2d6787d688\") " pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.624729 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.624639 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:35.646972 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.646938 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-wtsst"] Apr 21 15:36:35.651710 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.651687 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wtsst" Apr 21 15:36:35.654171 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.654150 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 21 15:36:35.654281 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.654196 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-t29ns\"" Apr 21 15:36:35.659255 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.659091 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-wtsst"] Apr 21 15:36:35.680712 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.680681 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f967e80-450d-4360-9a92-a5407235a3a9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wtsst\" (UID: \"0f967e80-450d-4360-9a92-a5407235a3a9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wtsst" Apr 21 15:36:35.782729 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.781765 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f967e80-450d-4360-9a92-a5407235a3a9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wtsst\" (UID: \"0f967e80-450d-4360-9a92-a5407235a3a9\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wtsst" Apr 21 15:36:35.782729 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:36:35.781913 2569 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 15:36:35.783022 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:36:35.782998 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f967e80-450d-4360-9a92-a5407235a3a9-monitoring-plugin-cert podName:0f967e80-450d-4360-9a92-a5407235a3a9 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:36.281973505 +0000 UTC m=+73.384425108 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/0f967e80-450d-4360-9a92-a5407235a3a9-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-wtsst" (UID: "0f967e80-450d-4360-9a92-a5407235a3a9") : secret "monitoring-plugin-cert" not found Apr 21 15:36:35.783431 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.783406 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9cb554df-hnj7z"] Apr 21 15:36:35.786816 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.786757 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.789458 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.789167 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 15:36:35.789458 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.789193 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 15:36:35.789458 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.789168 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-46h4z\"" Apr 21 15:36:35.789458 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.789315 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 15:36:35.789725 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.789554 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 15:36:35.790249 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.790215 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 15:36:35.794773 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.794750 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 15:36:35.799722 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.799696 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9cb554df-hnj7z"] Apr 21 15:36:35.882494 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.882412 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-config\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.882494 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.882456 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5khgr\" (UniqueName: \"kubernetes.io/projected/31fbf463-1da8-4893-9dd4-c1b4403a1294-kube-api-access-5khgr\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.882494 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.882487 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-trusted-ca-bundle\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.882732 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.882626 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-service-ca\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.882732 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.882688 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-serving-cert\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.882732 ip-10-0-133-158 kubenswrapper[2569]: I0421 
15:36:35.882722 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-oauth-config\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.882852 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.882822 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-oauth-serving-cert\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.984226 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.984190 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-service-ca\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.984411 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.984243 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-serving-cert\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.984411 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.984275 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-oauth-config\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " 
pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.984411 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.984305 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-oauth-serving-cert\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.984411 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.984346 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-config\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.984411 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.984380 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5khgr\" (UniqueName: \"kubernetes.io/projected/31fbf463-1da8-4893-9dd4-c1b4403a1294-kube-api-access-5khgr\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.984411 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.984403 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-trusted-ca-bundle\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.985058 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.985022 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-service-ca\") pod \"console-f9cb554df-hnj7z\" (UID: 
\"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.985263 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.985237 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-trusted-ca-bundle\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.985437 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.985417 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-config\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.985605 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.985549 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-oauth-serving-cert\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.987973 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.987942 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-oauth-config\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.988635 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.988609 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-serving-cert\") pod 
\"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:35.997344 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:35.997319 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5khgr\" (UniqueName: \"kubernetes.io/projected/31fbf463-1da8-4893-9dd4-c1b4403a1294-kube-api-access-5khgr\") pod \"console-f9cb554df-hnj7z\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:36.099042 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:36.099000 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:36.286432 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:36.286392 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f967e80-450d-4360-9a92-a5407235a3a9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wtsst\" (UID: \"0f967e80-450d-4360-9a92-a5407235a3a9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wtsst" Apr 21 15:36:36.286617 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:36:36.286583 2569 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 15:36:36.286688 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:36:36.286650 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f967e80-450d-4360-9a92-a5407235a3a9-monitoring-plugin-cert podName:0f967e80-450d-4360-9a92-a5407235a3a9 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:37.286631119 +0000 UTC m=+74.389082725 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/0f967e80-450d-4360-9a92-a5407235a3a9-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-wtsst" (UID: "0f967e80-450d-4360-9a92-a5407235a3a9") : secret "monitoring-plugin-cert" not found Apr 21 15:36:37.296715 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:37.296679 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f967e80-450d-4360-9a92-a5407235a3a9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wtsst\" (UID: \"0f967e80-450d-4360-9a92-a5407235a3a9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wtsst" Apr 21 15:36:37.299446 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:37.299419 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0f967e80-450d-4360-9a92-a5407235a3a9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wtsst\" (UID: \"0f967e80-450d-4360-9a92-a5407235a3a9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wtsst" Apr 21 15:36:37.463574 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:37.463537 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wtsst" Apr 21 15:36:39.757808 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:36:39.757778 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda429572e_f646_4624_aaee_489752ccaffb.slice/crio-e22745cdafae9177cb21053ea40664e27d7fb1f8f9a826ec9bdae007e5b1bc2c WatchSource:0}: Error finding container e22745cdafae9177cb21053ea40664e27d7fb1f8f9a826ec9bdae007e5b1bc2c: Status 404 returned error can't find the container with id e22745cdafae9177cb21053ea40664e27d7fb1f8f9a826ec9bdae007e5b1bc2c Apr 21 15:36:39.885271 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:39.885238 2569 generic.go:358] "Generic (PLEG): container finished" podID="ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58" containerID="b3f833cd5b56774eb7843cab1abd9ecf764f7ace4b05b01afad2c7243fa8afdb" exitCode=0 Apr 21 15:36:39.885446 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:39.885324 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pjqcl" event={"ID":"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58","Type":"ContainerDied","Data":"b3f833cd5b56774eb7843cab1abd9ecf764f7ace4b05b01afad2c7243fa8afdb"} Apr 21 15:36:39.886558 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:39.886528 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jscgp" event={"ID":"a429572e-f646-4624-aaee-489752ccaffb","Type":"ContainerStarted","Data":"e22745cdafae9177cb21053ea40664e27d7fb1f8f9a826ec9bdae007e5b1bc2c"} Apr 21 15:36:40.211601 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.211552 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8b68b5dff-cfnl7"] Apr 21 15:36:40.217156 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:36:40.217070 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23a4ea67_b844_43a1_bd74_fb2d6787d688.slice/crio-dcf43b4c74d5a6a99709127d5eee5d82d633d48012483946f13ba57c4fec87ff WatchSource:0}: Error finding container dcf43b4c74d5a6a99709127d5eee5d82d633d48012483946f13ba57c4fec87ff: Status 404 returned error can't find the container with id dcf43b4c74d5a6a99709127d5eee5d82d633d48012483946f13ba57c4fec87ff Apr 21 15:36:40.424007 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.423949 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-wtsst"] Apr 21 15:36:40.426691 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.426644 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9cb554df-hnj7z"] Apr 21 15:36:40.427474 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:36:40.427440 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f967e80_450d_4360_9a92_a5407235a3a9.slice/crio-2d45ba39523dd8d8c95038d465f83b8b2f2ea425f275055b23bc1f0b72c87f30 WatchSource:0}: Error finding container 2d45ba39523dd8d8c95038d465f83b8b2f2ea425f275055b23bc1f0b72c87f30: Status 404 returned error can't find the container with id 2d45ba39523dd8d8c95038d465f83b8b2f2ea425f275055b23bc1f0b72c87f30 Apr 21 15:36:40.431031 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:36:40.431002 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31fbf463_1da8_4893_9dd4_c1b4403a1294.slice/crio-72865c54c8d90d99eb3029534d73d27437dfa2d4dc16a36077ee382fbe76eb46 WatchSource:0}: Error finding container 72865c54c8d90d99eb3029534d73d27437dfa2d4dc16a36077ee382fbe76eb46: Status 404 returned error can't find the container with id 72865c54c8d90d99eb3029534d73d27437dfa2d4dc16a36077ee382fbe76eb46 Apr 21 15:36:40.675279 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.675199 2569 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9cb554df-hnj7z"] Apr 21 15:36:40.893868 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.893825 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" event={"ID":"2bc24c10-a6e7-48e8-a178-dbc7f52c7d59","Type":"ContainerStarted","Data":"82ca6618be3c1cb987362b6537b223ed05df7cc64680619d2c4060d40184bb2e"} Apr 21 15:36:40.896217 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.896189 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-hdmgw" event={"ID":"f12a02e9-41ef-4e4b-913a-8246dfcb282b","Type":"ContainerStarted","Data":"5e214aec078b024a2eb7b14eff55189f542d520f3a5cb31dab33b2062a621b3d"} Apr 21 15:36:40.896488 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.896454 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-hdmgw" Apr 21 15:36:40.900059 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.900031 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pjqcl" event={"ID":"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58","Type":"ContainerStarted","Data":"ea260c1e50e03be840b3cbdca19f89f4d129d2aea5a383588f3203373b53ec48"} Apr 21 15:36:40.900186 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.900075 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pjqcl" event={"ID":"ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58","Type":"ContainerStarted","Data":"a710073eed74479c4417a2a3f0aaea09290c02180576f2cd16b597a803e3f3a4"} Apr 21 15:36:40.901718 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.901696 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jscgp" 
event={"ID":"a429572e-f646-4624-aaee-489752ccaffb","Type":"ContainerStarted","Data":"0b822eccf71ef6b190aeedbff532064b0694a0451751fd37bddc09f30c6821b6"} Apr 21 15:36:40.903124 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.903091 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" event={"ID":"23a4ea67-b844-43a1-bd74-fb2d6787d688","Type":"ContainerStarted","Data":"dcf43b4c74d5a6a99709127d5eee5d82d633d48012483946f13ba57c4fec87ff"} Apr 21 15:36:40.904598 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.904571 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wtsst" event={"ID":"0f967e80-450d-4360-9a92-a5407235a3a9","Type":"ContainerStarted","Data":"2d45ba39523dd8d8c95038d465f83b8b2f2ea425f275055b23bc1f0b72c87f30"} Apr 21 15:36:40.906357 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.906331 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9cb554df-hnj7z" event={"ID":"31fbf463-1da8-4893-9dd4-c1b4403a1294","Type":"ContainerStarted","Data":"72865c54c8d90d99eb3029534d73d27437dfa2d4dc16a36077ee382fbe76eb46"} Apr 21 15:36:40.910238 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.910218 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-hdmgw" Apr 21 15:36:40.917101 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.917047 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sgdpn" podStartSLOduration=3.246980823 podStartE2EDuration="10.917033717s" podCreationTimestamp="2026-04-21 15:36:30 +0000 UTC" firstStartedPulling="2026-04-21 15:36:32.364988439 +0000 UTC m=+69.467440038" lastFinishedPulling="2026-04-21 15:36:40.035041314 +0000 UTC m=+77.137492932" observedRunningTime="2026-04-21 15:36:40.915411507 +0000 UTC m=+78.017863131" 
watchObservedRunningTime="2026-04-21 15:36:40.917033717 +0000 UTC m=+78.019485341" Apr 21 15:36:40.938186 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.938067 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-hdmgw" podStartSLOduration=1.9456431969999999 podStartE2EDuration="21.938050359s" podCreationTimestamp="2026-04-21 15:36:19 +0000 UTC" firstStartedPulling="2026-04-21 15:36:20.092000999 +0000 UTC m=+57.194452598" lastFinishedPulling="2026-04-21 15:36:40.084408162 +0000 UTC m=+77.186859760" observedRunningTime="2026-04-21 15:36:40.935028763 +0000 UTC m=+78.037480375" watchObservedRunningTime="2026-04-21 15:36:40.938050359 +0000 UTC m=+78.040501982" Apr 21 15:36:40.964677 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:40.964623 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pjqcl" podStartSLOduration=10.105123092 podStartE2EDuration="10.964609455s" podCreationTimestamp="2026-04-21 15:36:30 +0000 UTC" firstStartedPulling="2026-04-21 15:36:31.9042642 +0000 UTC m=+69.006715807" lastFinishedPulling="2026-04-21 15:36:32.763750554 +0000 UTC m=+69.866202170" observedRunningTime="2026-04-21 15:36:40.964344337 +0000 UTC m=+78.066795980" watchObservedRunningTime="2026-04-21 15:36:40.964609455 +0000 UTC m=+78.067061076" Apr 21 15:36:41.918202 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:41.917723 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jscgp" event={"ID":"a429572e-f646-4624-aaee-489752ccaffb","Type":"ContainerStarted","Data":"a7d85795f0ef0a9d296b4d05effa3d197e64be157291725c2051b0746d77d73c"} Apr 21 15:36:42.815402 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:42.815367 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-789665c8f5-2khmb" Apr 21 15:36:45.932077 ip-10-0-133-158 kubenswrapper[2569]: 
I0421 15:36:45.932035 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jscgp" event={"ID":"a429572e-f646-4624-aaee-489752ccaffb","Type":"ContainerStarted","Data":"5f693d7a0fd6855f8e7de294a86677dbde2ed2a1248a54da20ac3bc8f7c03917"} Apr 21 15:36:45.933655 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:45.933612 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" event={"ID":"23a4ea67-b844-43a1-bd74-fb2d6787d688","Type":"ContainerStarted","Data":"42223120ee7490f1775ce657af3759dd850553fa3a9f8d8085413df0f3ab353d"} Apr 21 15:36:45.935150 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:45.935108 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wtsst" event={"ID":"0f967e80-450d-4360-9a92-a5407235a3a9","Type":"ContainerStarted","Data":"43b8199f3dc9e782d1fc7da2771d61ffb2f4d082130a33f7ddc6488971502e37"} Apr 21 15:36:45.935583 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:45.935550 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wtsst" Apr 21 15:36:45.937370 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:45.937339 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9cb554df-hnj7z" event={"ID":"31fbf463-1da8-4893-9dd4-c1b4403a1294","Type":"ContainerStarted","Data":"00260f23b51afee0f2213034d577736c4fa1641f8f9e75c911d2d5429ed13c37"} Apr 21 15:36:45.945747 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:45.945726 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wtsst" Apr 21 15:36:45.957948 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:45.957898 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-jscgp" 
podStartSLOduration=9.578459257 podStartE2EDuration="14.957884281s" podCreationTimestamp="2026-04-21 15:36:31 +0000 UTC" firstStartedPulling="2026-04-21 15:36:40.13624979 +0000 UTC m=+77.238701403" lastFinishedPulling="2026-04-21 15:36:45.515674815 +0000 UTC m=+82.618126427" observedRunningTime="2026-04-21 15:36:45.955948083 +0000 UTC m=+83.058399706" watchObservedRunningTime="2026-04-21 15:36:45.957884281 +0000 UTC m=+83.060335903" Apr 21 15:36:45.978815 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:45.978760 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" podStartSLOduration=5.707234462 podStartE2EDuration="10.978746422s" podCreationTimestamp="2026-04-21 15:36:35 +0000 UTC" firstStartedPulling="2026-04-21 15:36:40.218956086 +0000 UTC m=+77.321407685" lastFinishedPulling="2026-04-21 15:36:45.490468031 +0000 UTC m=+82.592919645" observedRunningTime="2026-04-21 15:36:45.978069407 +0000 UTC m=+83.080521027" watchObservedRunningTime="2026-04-21 15:36:45.978746422 +0000 UTC m=+83.081198043" Apr 21 15:36:45.995815 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:45.995766 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wtsst" podStartSLOduration=5.909838656 podStartE2EDuration="10.99574786s" podCreationTimestamp="2026-04-21 15:36:35 +0000 UTC" firstStartedPulling="2026-04-21 15:36:40.429736359 +0000 UTC m=+77.532187961" lastFinishedPulling="2026-04-21 15:36:45.515645564 +0000 UTC m=+82.618097165" observedRunningTime="2026-04-21 15:36:45.994438613 +0000 UTC m=+83.096890238" watchObservedRunningTime="2026-04-21 15:36:45.99574786 +0000 UTC m=+83.098199484" Apr 21 15:36:46.017165 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:46.017093 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9cb554df-hnj7z" podStartSLOduration=5.927835079 
podStartE2EDuration="11.01707502s" podCreationTimestamp="2026-04-21 15:36:35 +0000 UTC" firstStartedPulling="2026-04-21 15:36:40.433289818 +0000 UTC m=+77.535741417" lastFinishedPulling="2026-04-21 15:36:45.52252976 +0000 UTC m=+82.624981358" observedRunningTime="2026-04-21 15:36:46.014817394 +0000 UTC m=+83.117269043" watchObservedRunningTime="2026-04-21 15:36:46.01707502 +0000 UTC m=+83.119526643" Apr 21 15:36:46.100180 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:46.100107 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:36:55.625190 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:55.625124 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:36:55.625190 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:36:55.625193 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:37:10.963861 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:10.963797 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-f9cb554df-hnj7z" podUID="31fbf463-1da8-4893-9dd4-c1b4403a1294" containerName="console" containerID="cri-o://00260f23b51afee0f2213034d577736c4fa1641f8f9e75c911d2d5429ed13c37" gracePeriod=15 Apr 21 15:37:11.198627 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.198606 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9cb554df-hnj7z_31fbf463-1da8-4893-9dd4-c1b4403a1294/console/0.log" Apr 21 15:37:11.198737 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.198675 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:37:11.285334 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.285240 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-trusted-ca-bundle\") pod \"31fbf463-1da8-4893-9dd4-c1b4403a1294\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " Apr 21 15:37:11.285334 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.285284 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-service-ca\") pod \"31fbf463-1da8-4893-9dd4-c1b4403a1294\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " Apr 21 15:37:11.285334 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.285318 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5khgr\" (UniqueName: \"kubernetes.io/projected/31fbf463-1da8-4893-9dd4-c1b4403a1294-kube-api-access-5khgr\") pod \"31fbf463-1da8-4893-9dd4-c1b4403a1294\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " Apr 21 15:37:11.285604 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.285339 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-oauth-serving-cert\") pod \"31fbf463-1da8-4893-9dd4-c1b4403a1294\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " Apr 21 15:37:11.285604 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.285373 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-config\") pod \"31fbf463-1da8-4893-9dd4-c1b4403a1294\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " Apr 21 15:37:11.285604 ip-10-0-133-158 
kubenswrapper[2569]: I0421 15:37:11.285400 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-serving-cert\") pod \"31fbf463-1da8-4893-9dd4-c1b4403a1294\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " Apr 21 15:37:11.285604 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.285430 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-oauth-config\") pod \"31fbf463-1da8-4893-9dd4-c1b4403a1294\" (UID: \"31fbf463-1da8-4893-9dd4-c1b4403a1294\") " Apr 21 15:37:11.285834 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.285779 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-service-ca" (OuterVolumeSpecName: "service-ca") pod "31fbf463-1da8-4893-9dd4-c1b4403a1294" (UID: "31fbf463-1da8-4893-9dd4-c1b4403a1294"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:37:11.285896 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.285822 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "31fbf463-1da8-4893-9dd4-c1b4403a1294" (UID: "31fbf463-1da8-4893-9dd4-c1b4403a1294"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:37:11.285977 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.285947 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-config" (OuterVolumeSpecName: "console-config") pod "31fbf463-1da8-4893-9dd4-c1b4403a1294" (UID: "31fbf463-1da8-4893-9dd4-c1b4403a1294"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:37:11.286211 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.286184 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "31fbf463-1da8-4893-9dd4-c1b4403a1294" (UID: "31fbf463-1da8-4893-9dd4-c1b4403a1294"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:37:11.287736 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.287682 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "31fbf463-1da8-4893-9dd4-c1b4403a1294" (UID: "31fbf463-1da8-4893-9dd4-c1b4403a1294"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:37:11.287736 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.287719 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "31fbf463-1da8-4893-9dd4-c1b4403a1294" (UID: "31fbf463-1da8-4893-9dd4-c1b4403a1294"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:37:11.287866 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.287772 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31fbf463-1da8-4893-9dd4-c1b4403a1294-kube-api-access-5khgr" (OuterVolumeSpecName: "kube-api-access-5khgr") pod "31fbf463-1da8-4893-9dd4-c1b4403a1294" (UID: "31fbf463-1da8-4893-9dd4-c1b4403a1294"). InnerVolumeSpecName "kube-api-access-5khgr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:37:11.386464 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.386422 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-trusted-ca-bundle\") on node \"ip-10-0-133-158.ec2.internal\" DevicePath \"\"" Apr 21 15:37:11.386464 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.386457 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-service-ca\") on node \"ip-10-0-133-158.ec2.internal\" DevicePath \"\"" Apr 21 15:37:11.386464 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.386468 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5khgr\" (UniqueName: \"kubernetes.io/projected/31fbf463-1da8-4893-9dd4-c1b4403a1294-kube-api-access-5khgr\") on node \"ip-10-0-133-158.ec2.internal\" DevicePath \"\"" Apr 21 15:37:11.386709 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.386478 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-oauth-serving-cert\") on node \"ip-10-0-133-158.ec2.internal\" DevicePath \"\"" Apr 21 15:37:11.386709 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.386488 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-config\") on node \"ip-10-0-133-158.ec2.internal\" DevicePath \"\"" Apr 21 15:37:11.386709 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.386496 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-serving-cert\") on node \"ip-10-0-133-158.ec2.internal\" DevicePath \"\"" Apr 21 15:37:11.386709 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:11.386505 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31fbf463-1da8-4893-9dd4-c1b4403a1294-console-oauth-config\") on node \"ip-10-0-133-158.ec2.internal\" DevicePath \"\"" Apr 21 15:37:12.008936 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:12.008910 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9cb554df-hnj7z_31fbf463-1da8-4893-9dd4-c1b4403a1294/console/0.log" Apr 21 15:37:12.009328 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:12.008960 2569 generic.go:358] "Generic (PLEG): container finished" podID="31fbf463-1da8-4893-9dd4-c1b4403a1294" containerID="00260f23b51afee0f2213034d577736c4fa1641f8f9e75c911d2d5429ed13c37" exitCode=2 Apr 21 15:37:12.009328 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:12.009020 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9cb554df-hnj7z" event={"ID":"31fbf463-1da8-4893-9dd4-c1b4403a1294","Type":"ContainerDied","Data":"00260f23b51afee0f2213034d577736c4fa1641f8f9e75c911d2d5429ed13c37"} Apr 21 15:37:12.009328 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:12.009033 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9cb554df-hnj7z" Apr 21 15:37:12.009328 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:12.009055 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9cb554df-hnj7z" event={"ID":"31fbf463-1da8-4893-9dd4-c1b4403a1294","Type":"ContainerDied","Data":"72865c54c8d90d99eb3029534d73d27437dfa2d4dc16a36077ee382fbe76eb46"} Apr 21 15:37:12.009328 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:12.009076 2569 scope.go:117] "RemoveContainer" containerID="00260f23b51afee0f2213034d577736c4fa1641f8f9e75c911d2d5429ed13c37" Apr 21 15:37:12.016543 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:12.016524 2569 scope.go:117] "RemoveContainer" containerID="00260f23b51afee0f2213034d577736c4fa1641f8f9e75c911d2d5429ed13c37" Apr 21 15:37:12.016764 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:37:12.016747 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00260f23b51afee0f2213034d577736c4fa1641f8f9e75c911d2d5429ed13c37\": container with ID starting with 00260f23b51afee0f2213034d577736c4fa1641f8f9e75c911d2d5429ed13c37 not found: ID does not exist" containerID="00260f23b51afee0f2213034d577736c4fa1641f8f9e75c911d2d5429ed13c37" Apr 21 15:37:12.016814 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:12.016771 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00260f23b51afee0f2213034d577736c4fa1641f8f9e75c911d2d5429ed13c37"} err="failed to get container status \"00260f23b51afee0f2213034d577736c4fa1641f8f9e75c911d2d5429ed13c37\": rpc error: code = NotFound desc = could not find container \"00260f23b51afee0f2213034d577736c4fa1641f8f9e75c911d2d5429ed13c37\": container with ID starting with 00260f23b51afee0f2213034d577736c4fa1641f8f9e75c911d2d5429ed13c37 not found: ID does not exist" Apr 21 15:37:12.028969 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:12.028932 2569 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9cb554df-hnj7z"] Apr 21 15:37:12.034841 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:12.034818 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9cb554df-hnj7z"] Apr 21 15:37:13.547838 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:13.547798 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31fbf463-1da8-4893-9dd4-c1b4403a1294" path="/var/lib/kubelet/pods/31fbf463-1da8-4893-9dd4-c1b4403a1294/volumes" Apr 21 15:37:15.630230 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:15.630201 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:37:15.634021 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:37:15.633997 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-8b68b5dff-cfnl7" Apr 21 15:38:28.818550 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.818473 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-2p6kz"] Apr 21 15:38:28.819002 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.818728 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31fbf463-1da8-4893-9dd4-c1b4403a1294" containerName="console" Apr 21 15:38:28.819002 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.818738 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="31fbf463-1da8-4893-9dd4-c1b4403a1294" containerName="console" Apr 21 15:38:28.819002 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.818784 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="31fbf463-1da8-4893-9dd4-c1b4403a1294" containerName="console" Apr 21 15:38:28.821804 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.821765 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-2p6kz" Apr 21 15:38:28.824585 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.824565 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 15:38:28.829246 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.829221 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2p6kz"] Apr 21 15:38:28.878330 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.878292 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/08cd6fc2-7059-4977-908e-bd91ccd03110-dbus\") pod \"global-pull-secret-syncer-2p6kz\" (UID: \"08cd6fc2-7059-4977-908e-bd91ccd03110\") " pod="kube-system/global-pull-secret-syncer-2p6kz" Apr 21 15:38:28.878482 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.878372 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/08cd6fc2-7059-4977-908e-bd91ccd03110-original-pull-secret\") pod \"global-pull-secret-syncer-2p6kz\" (UID: \"08cd6fc2-7059-4977-908e-bd91ccd03110\") " pod="kube-system/global-pull-secret-syncer-2p6kz" Apr 21 15:38:28.878482 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.878429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/08cd6fc2-7059-4977-908e-bd91ccd03110-kubelet-config\") pod \"global-pull-secret-syncer-2p6kz\" (UID: \"08cd6fc2-7059-4977-908e-bd91ccd03110\") " pod="kube-system/global-pull-secret-syncer-2p6kz" Apr 21 15:38:28.978732 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.978690 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/08cd6fc2-7059-4977-908e-bd91ccd03110-original-pull-secret\") pod \"global-pull-secret-syncer-2p6kz\" (UID: \"08cd6fc2-7059-4977-908e-bd91ccd03110\") " pod="kube-system/global-pull-secret-syncer-2p6kz" Apr 21 15:38:28.978913 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.978756 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/08cd6fc2-7059-4977-908e-bd91ccd03110-kubelet-config\") pod \"global-pull-secret-syncer-2p6kz\" (UID: \"08cd6fc2-7059-4977-908e-bd91ccd03110\") " pod="kube-system/global-pull-secret-syncer-2p6kz" Apr 21 15:38:28.978913 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.978807 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/08cd6fc2-7059-4977-908e-bd91ccd03110-dbus\") pod \"global-pull-secret-syncer-2p6kz\" (UID: \"08cd6fc2-7059-4977-908e-bd91ccd03110\") " pod="kube-system/global-pull-secret-syncer-2p6kz" Apr 21 15:38:28.978913 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.978892 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/08cd6fc2-7059-4977-908e-bd91ccd03110-kubelet-config\") pod \"global-pull-secret-syncer-2p6kz\" (UID: \"08cd6fc2-7059-4977-908e-bd91ccd03110\") " pod="kube-system/global-pull-secret-syncer-2p6kz" Apr 21 15:38:28.979018 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.978970 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/08cd6fc2-7059-4977-908e-bd91ccd03110-dbus\") pod \"global-pull-secret-syncer-2p6kz\" (UID: \"08cd6fc2-7059-4977-908e-bd91ccd03110\") " pod="kube-system/global-pull-secret-syncer-2p6kz" Apr 21 15:38:28.981004 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:28.980982 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/08cd6fc2-7059-4977-908e-bd91ccd03110-original-pull-secret\") pod \"global-pull-secret-syncer-2p6kz\" (UID: \"08cd6fc2-7059-4977-908e-bd91ccd03110\") " pod="kube-system/global-pull-secret-syncer-2p6kz" Apr 21 15:38:29.131178 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:29.131073 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2p6kz" Apr 21 15:38:29.249512 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:29.249478 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2p6kz"] Apr 21 15:38:29.252352 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:38:29.252324 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08cd6fc2_7059_4977_908e_bd91ccd03110.slice/crio-69e3fdec4841861f2b82bc0d68b1d8d789b9bb0e9838371bb7055235f6f70904 WatchSource:0}: Error finding container 69e3fdec4841861f2b82bc0d68b1d8d789b9bb0e9838371bb7055235f6f70904: Status 404 returned error can't find the container with id 69e3fdec4841861f2b82bc0d68b1d8d789b9bb0e9838371bb7055235f6f70904 Apr 21 15:38:30.220525 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:30.220479 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2p6kz" event={"ID":"08cd6fc2-7059-4977-908e-bd91ccd03110","Type":"ContainerStarted","Data":"69e3fdec4841861f2b82bc0d68b1d8d789b9bb0e9838371bb7055235f6f70904"} Apr 21 15:38:34.233225 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:34.233184 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2p6kz" event={"ID":"08cd6fc2-7059-4977-908e-bd91ccd03110","Type":"ContainerStarted","Data":"953656b85b380b37b3c606d60c37627071d57163fee911740a64735097b29ee3"} Apr 21 15:38:34.260382 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:38:34.260335 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-2p6kz" podStartSLOduration=2.3187018520000002 podStartE2EDuration="6.260321323s" podCreationTimestamp="2026-04-21 15:38:28 +0000 UTC" firstStartedPulling="2026-04-21 15:38:29.253949141 +0000 UTC m=+186.356400740" lastFinishedPulling="2026-04-21 15:38:33.195568612 +0000 UTC m=+190.298020211" observedRunningTime="2026-04-21 15:38:34.258021676 +0000 UTC m=+191.360473296" watchObservedRunningTime="2026-04-21 15:38:34.260321323 +0000 UTC m=+191.362772942" Apr 21 15:41:12.618368 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:12.618282 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-6b94ff949c-vtgsb"] Apr 21 15:41:12.621454 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:12.621430 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6b94ff949c-vtgsb" Apr 21 15:41:12.624205 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:12.624166 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 21 15:41:12.624205 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:12.624186 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 21 15:41:12.624466 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:12.624450 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-b86m6\"" Apr 21 15:41:12.625468 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:12.625445 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 21 15:41:12.633102 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:12.633079 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6b94ff949c-vtgsb"] Apr 21 
15:41:12.681061 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:12.681017 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/797464d7-ee23-4c4d-908d-ba863a1e2e8f-cert\") pod \"llmisvc-controller-manager-6b94ff949c-vtgsb\" (UID: \"797464d7-ee23-4c4d-908d-ba863a1e2e8f\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-vtgsb" Apr 21 15:41:12.681268 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:12.681080 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9mxd\" (UniqueName: \"kubernetes.io/projected/797464d7-ee23-4c4d-908d-ba863a1e2e8f-kube-api-access-f9mxd\") pod \"llmisvc-controller-manager-6b94ff949c-vtgsb\" (UID: \"797464d7-ee23-4c4d-908d-ba863a1e2e8f\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-vtgsb" Apr 21 15:41:12.781756 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:12.781717 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9mxd\" (UniqueName: \"kubernetes.io/projected/797464d7-ee23-4c4d-908d-ba863a1e2e8f-kube-api-access-f9mxd\") pod \"llmisvc-controller-manager-6b94ff949c-vtgsb\" (UID: \"797464d7-ee23-4c4d-908d-ba863a1e2e8f\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-vtgsb" Apr 21 15:41:12.781931 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:12.781821 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/797464d7-ee23-4c4d-908d-ba863a1e2e8f-cert\") pod \"llmisvc-controller-manager-6b94ff949c-vtgsb\" (UID: \"797464d7-ee23-4c4d-908d-ba863a1e2e8f\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-vtgsb" Apr 21 15:41:12.781996 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:41:12.781952 2569 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 21 15:41:12.782102 ip-10-0-133-158 
kubenswrapper[2569]: E0421 15:41:12.782029 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/797464d7-ee23-4c4d-908d-ba863a1e2e8f-cert podName:797464d7-ee23-4c4d-908d-ba863a1e2e8f nodeName:}" failed. No retries permitted until 2026-04-21 15:41:13.282006751 +0000 UTC m=+350.384458353 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/797464d7-ee23-4c4d-908d-ba863a1e2e8f-cert") pod "llmisvc-controller-manager-6b94ff949c-vtgsb" (UID: "797464d7-ee23-4c4d-908d-ba863a1e2e8f") : secret "llmisvc-webhook-server-cert" not found Apr 21 15:41:12.791947 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:12.791922 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9mxd\" (UniqueName: \"kubernetes.io/projected/797464d7-ee23-4c4d-908d-ba863a1e2e8f-kube-api-access-f9mxd\") pod \"llmisvc-controller-manager-6b94ff949c-vtgsb\" (UID: \"797464d7-ee23-4c4d-908d-ba863a1e2e8f\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-vtgsb" Apr 21 15:41:13.286388 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:13.286352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/797464d7-ee23-4c4d-908d-ba863a1e2e8f-cert\") pod \"llmisvc-controller-manager-6b94ff949c-vtgsb\" (UID: \"797464d7-ee23-4c4d-908d-ba863a1e2e8f\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-vtgsb" Apr 21 15:41:13.288789 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:13.288754 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/797464d7-ee23-4c4d-908d-ba863a1e2e8f-cert\") pod \"llmisvc-controller-manager-6b94ff949c-vtgsb\" (UID: \"797464d7-ee23-4c4d-908d-ba863a1e2e8f\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-vtgsb" Apr 21 15:41:13.532718 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:13.532686 2569 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6b94ff949c-vtgsb" Apr 21 15:41:13.655362 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:13.655335 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6b94ff949c-vtgsb"] Apr 21 15:41:13.658297 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:41:13.658268 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod797464d7_ee23_4c4d_908d_ba863a1e2e8f.slice/crio-447b3deddf125ece9ecc28dffa2f9ed93eaa2c988e731eceaf879e1b6221942c WatchSource:0}: Error finding container 447b3deddf125ece9ecc28dffa2f9ed93eaa2c988e731eceaf879e1b6221942c: Status 404 returned error can't find the container with id 447b3deddf125ece9ecc28dffa2f9ed93eaa2c988e731eceaf879e1b6221942c Apr 21 15:41:13.659501 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:13.659486 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:41:14.654237 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:14.654199 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6b94ff949c-vtgsb" event={"ID":"797464d7-ee23-4c4d-908d-ba863a1e2e8f","Type":"ContainerStarted","Data":"447b3deddf125ece9ecc28dffa2f9ed93eaa2c988e731eceaf879e1b6221942c"} Apr 21 15:41:16.660790 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:16.660750 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6b94ff949c-vtgsb" event={"ID":"797464d7-ee23-4c4d-908d-ba863a1e2e8f","Type":"ContainerStarted","Data":"ac0dc13dd8e9e27d6cc33d20463e3f43ea4bbfab9e2cb9c3d8d0e07aff06a314"} Apr 21 15:41:16.661211 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:16.660884 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-6b94ff949c-vtgsb" Apr 21 15:41:16.681374 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:16.681321 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-6b94ff949c-vtgsb" podStartSLOduration=2.203334011 podStartE2EDuration="4.681303639s" podCreationTimestamp="2026-04-21 15:41:12 +0000 UTC" firstStartedPulling="2026-04-21 15:41:13.659604644 +0000 UTC m=+350.762056243" lastFinishedPulling="2026-04-21 15:41:16.137574272 +0000 UTC m=+353.240025871" observedRunningTime="2026-04-21 15:41:16.680579566 +0000 UTC m=+353.783031187" watchObservedRunningTime="2026-04-21 15:41:16.681303639 +0000 UTC m=+353.783755259" Apr 21 15:41:47.667963 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:47.667931 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-6b94ff949c-vtgsb" Apr 21 15:41:49.004251 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:49.004210 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-5f58h"] Apr 21 15:41:49.008446 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:49.008423 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-9c85dd4d8-5f58h" Apr 21 15:41:49.011925 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:49.011905 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-s4rzm\"" Apr 21 15:41:49.012028 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:49.011938 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 21 15:41:49.018237 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:49.018202 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-5f58h"] Apr 21 15:41:49.058511 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:49.058473 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfmkn\" (UniqueName: \"kubernetes.io/projected/3ae42f61-7b5a-4286-a54e-03b6f3fb89f3-kube-api-access-nfmkn\") pod \"kserve-controller-manager-9c85dd4d8-5f58h\" (UID: \"3ae42f61-7b5a-4286-a54e-03b6f3fb89f3\") " pod="kserve/kserve-controller-manager-9c85dd4d8-5f58h" Apr 21 15:41:49.058680 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:49.058558 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ae42f61-7b5a-4286-a54e-03b6f3fb89f3-cert\") pod \"kserve-controller-manager-9c85dd4d8-5f58h\" (UID: \"3ae42f61-7b5a-4286-a54e-03b6f3fb89f3\") " pod="kserve/kserve-controller-manager-9c85dd4d8-5f58h" Apr 21 15:41:49.159109 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:49.159065 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfmkn\" (UniqueName: \"kubernetes.io/projected/3ae42f61-7b5a-4286-a54e-03b6f3fb89f3-kube-api-access-nfmkn\") pod \"kserve-controller-manager-9c85dd4d8-5f58h\" (UID: \"3ae42f61-7b5a-4286-a54e-03b6f3fb89f3\") " 
pod="kserve/kserve-controller-manager-9c85dd4d8-5f58h" Apr 21 15:41:49.159307 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:49.159152 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ae42f61-7b5a-4286-a54e-03b6f3fb89f3-cert\") pod \"kserve-controller-manager-9c85dd4d8-5f58h\" (UID: \"3ae42f61-7b5a-4286-a54e-03b6f3fb89f3\") " pod="kserve/kserve-controller-manager-9c85dd4d8-5f58h" Apr 21 15:41:49.161558 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:49.161534 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ae42f61-7b5a-4286-a54e-03b6f3fb89f3-cert\") pod \"kserve-controller-manager-9c85dd4d8-5f58h\" (UID: \"3ae42f61-7b5a-4286-a54e-03b6f3fb89f3\") " pod="kserve/kserve-controller-manager-9c85dd4d8-5f58h" Apr 21 15:41:49.169092 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:49.169066 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfmkn\" (UniqueName: \"kubernetes.io/projected/3ae42f61-7b5a-4286-a54e-03b6f3fb89f3-kube-api-access-nfmkn\") pod \"kserve-controller-manager-9c85dd4d8-5f58h\" (UID: \"3ae42f61-7b5a-4286-a54e-03b6f3fb89f3\") " pod="kserve/kserve-controller-manager-9c85dd4d8-5f58h" Apr 21 15:41:49.319523 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:49.319441 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-9c85dd4d8-5f58h"
Apr 21 15:41:49.452505 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:49.452480 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-5f58h"]
Apr 21 15:41:49.455047 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:41:49.455017 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ae42f61_7b5a_4286_a54e_03b6f3fb89f3.slice/crio-54befff925d9ca4b580a626fa219f61bd15add692e3e2b73bdc821a3d4596cd5 WatchSource:0}: Error finding container 54befff925d9ca4b580a626fa219f61bd15add692e3e2b73bdc821a3d4596cd5: Status 404 returned error can't find the container with id 54befff925d9ca4b580a626fa219f61bd15add692e3e2b73bdc821a3d4596cd5
Apr 21 15:41:49.747146 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:49.747107 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9c85dd4d8-5f58h" event={"ID":"3ae42f61-7b5a-4286-a54e-03b6f3fb89f3","Type":"ContainerStarted","Data":"54befff925d9ca4b580a626fa219f61bd15add692e3e2b73bdc821a3d4596cd5"}
Apr 21 15:41:52.756755 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:52.756722 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9c85dd4d8-5f58h" event={"ID":"3ae42f61-7b5a-4286-a54e-03b6f3fb89f3","Type":"ContainerStarted","Data":"3956247ad2c9ae53cd7d82370ed5b6d178d7f8bbd071f2a4260698334b2c8e15"}
Apr 21 15:41:52.757244 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:52.756839 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-9c85dd4d8-5f58h"
Apr 21 15:41:52.780565 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:41:52.780510 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-9c85dd4d8-5f58h" podStartSLOduration=2.340952276 podStartE2EDuration="4.780492848s" podCreationTimestamp="2026-04-21 15:41:48 +0000 UTC" firstStartedPulling="2026-04-21 15:41:49.456275409 +0000 UTC m=+386.558727007" lastFinishedPulling="2026-04-21 15:41:51.89581598 +0000 UTC m=+388.998267579" observedRunningTime="2026-04-21 15:41:52.779274034 +0000 UTC m=+389.881725655" watchObservedRunningTime="2026-04-21 15:41:52.780492848 +0000 UTC m=+389.882944470"
Apr 21 15:42:23.765306 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:42:23.765275 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-9c85dd4d8-5f58h"
Apr 21 15:46:16.536759 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.536720 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"]
Apr 21 15:46:16.538616 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.538597 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"
Apr 21 15:46:16.541039 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.541012 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-4bd9d-serving-cert\""
Apr 21 15:46:16.541351 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.541335 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 21 15:46:16.541829 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.541809 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-4bd9d-kube-rbac-proxy-sar-config\""
Apr 21 15:46:16.541915 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.541811 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xqtsr\""
Apr 21 15:46:16.552793 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.552768 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"]
Apr 21 15:46:16.702891 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.702853 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30869df4-7634-4299-ac1b-cbd8660110fc-proxy-tls\") pod \"model-chainer-raw-4bd9d-b569d849-26rp9\" (UID: \"30869df4-7634-4299-ac1b-cbd8660110fc\") " pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"
Apr 21 15:46:16.703057 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.702983 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30869df4-7634-4299-ac1b-cbd8660110fc-openshift-service-ca-bundle\") pod \"model-chainer-raw-4bd9d-b569d849-26rp9\" (UID: \"30869df4-7634-4299-ac1b-cbd8660110fc\") " pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"
Apr 21 15:46:16.804277 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.804182 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30869df4-7634-4299-ac1b-cbd8660110fc-openshift-service-ca-bundle\") pod \"model-chainer-raw-4bd9d-b569d849-26rp9\" (UID: \"30869df4-7634-4299-ac1b-cbd8660110fc\") " pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"
Apr 21 15:46:16.804277 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.804229 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30869df4-7634-4299-ac1b-cbd8660110fc-proxy-tls\") pod \"model-chainer-raw-4bd9d-b569d849-26rp9\" (UID: \"30869df4-7634-4299-ac1b-cbd8660110fc\") " pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"
Apr 21 15:46:16.804908 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.804871 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30869df4-7634-4299-ac1b-cbd8660110fc-openshift-service-ca-bundle\") pod \"model-chainer-raw-4bd9d-b569d849-26rp9\" (UID: \"30869df4-7634-4299-ac1b-cbd8660110fc\") " pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"
Apr 21 15:46:16.806550 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.806522 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30869df4-7634-4299-ac1b-cbd8660110fc-proxy-tls\") pod \"model-chainer-raw-4bd9d-b569d849-26rp9\" (UID: \"30869df4-7634-4299-ac1b-cbd8660110fc\") " pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"
Apr 21 15:46:16.849184 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.849150 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"
Apr 21 15:46:16.978317 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.978187 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"]
Apr 21 15:46:16.981120 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:46:16.981093 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30869df4_7634_4299_ac1b_cbd8660110fc.slice/crio-39799f4490af8d509c10a359a6ce894ac58d3a68fd0b25788034d91de1cdce0d WatchSource:0}: Error finding container 39799f4490af8d509c10a359a6ce894ac58d3a68fd0b25788034d91de1cdce0d: Status 404 returned error can't find the container with id 39799f4490af8d509c10a359a6ce894ac58d3a68fd0b25788034d91de1cdce0d
Apr 21 15:46:16.983321 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:16.983307 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 15:46:17.488524 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:17.488488 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9" event={"ID":"30869df4-7634-4299-ac1b-cbd8660110fc","Type":"ContainerStarted","Data":"39799f4490af8d509c10a359a6ce894ac58d3a68fd0b25788034d91de1cdce0d"}
Apr 21 15:46:19.494303 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:19.494267 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9" event={"ID":"30869df4-7634-4299-ac1b-cbd8660110fc","Type":"ContainerStarted","Data":"1a41eb2e33adad599c7a77251c3f2951a88c3d1e3fd45ac2105f4a676051c1fa"}
Apr 21 15:46:19.494672 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:19.494346 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"
Apr 21 15:46:19.513227 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:19.513171 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9" podStartSLOduration=1.131460085 podStartE2EDuration="3.513156133s" podCreationTimestamp="2026-04-21 15:46:16 +0000 UTC" firstStartedPulling="2026-04-21 15:46:16.983427414 +0000 UTC m=+654.085879013" lastFinishedPulling="2026-04-21 15:46:19.365123458 +0000 UTC m=+656.467575061" observedRunningTime="2026-04-21 15:46:19.511938208 +0000 UTC m=+656.614389831" watchObservedRunningTime="2026-04-21 15:46:19.513156133 +0000 UTC m=+656.615607753"
Apr 21 15:46:25.502970 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:25.502941 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"
Apr 21 15:46:26.584369 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:26.584338 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"]
Apr 21 15:46:26.584777 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:26.584556 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9" podUID="30869df4-7634-4299-ac1b-cbd8660110fc" containerName="model-chainer-raw-4bd9d" containerID="cri-o://1a41eb2e33adad599c7a77251c3f2951a88c3d1e3fd45ac2105f4a676051c1fa" gracePeriod=30
Apr 21 15:46:30.501523 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:30.501462 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9" podUID="30869df4-7634-4299-ac1b-cbd8660110fc" containerName="model-chainer-raw-4bd9d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:46:35.501481 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:35.501439 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9" podUID="30869df4-7634-4299-ac1b-cbd8660110fc" containerName="model-chainer-raw-4bd9d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:46:40.502118 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:40.502070 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9" podUID="30869df4-7634-4299-ac1b-cbd8660110fc" containerName="model-chainer-raw-4bd9d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:46:40.502668 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:40.502250 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"
Apr 21 15:46:45.501113 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:45.501068 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9" podUID="30869df4-7634-4299-ac1b-cbd8660110fc" containerName="model-chainer-raw-4bd9d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:46:50.500824 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:50.500781 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9" podUID="30869df4-7634-4299-ac1b-cbd8660110fc" containerName="model-chainer-raw-4bd9d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:46:55.501036 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:55.500994 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9" podUID="30869df4-7634-4299-ac1b-cbd8660110fc" containerName="model-chainer-raw-4bd9d" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:46:56.725357 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:56.725335 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"
Apr 21 15:46:56.772065 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:56.772033 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30869df4-7634-4299-ac1b-cbd8660110fc-openshift-service-ca-bundle\") pod \"30869df4-7634-4299-ac1b-cbd8660110fc\" (UID: \"30869df4-7634-4299-ac1b-cbd8660110fc\") "
Apr 21 15:46:56.772261 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:56.772086 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30869df4-7634-4299-ac1b-cbd8660110fc-proxy-tls\") pod \"30869df4-7634-4299-ac1b-cbd8660110fc\" (UID: \"30869df4-7634-4299-ac1b-cbd8660110fc\") "
Apr 21 15:46:56.772670 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:56.772644 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30869df4-7634-4299-ac1b-cbd8660110fc-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "30869df4-7634-4299-ac1b-cbd8660110fc" (UID: "30869df4-7634-4299-ac1b-cbd8660110fc"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:46:56.774260 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:56.774227 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30869df4-7634-4299-ac1b-cbd8660110fc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "30869df4-7634-4299-ac1b-cbd8660110fc" (UID: "30869df4-7634-4299-ac1b-cbd8660110fc"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:46:56.872734 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:56.872635 2569 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30869df4-7634-4299-ac1b-cbd8660110fc-openshift-service-ca-bundle\") on node \"ip-10-0-133-158.ec2.internal\" DevicePath \"\""
Apr 21 15:46:56.872734 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:56.872680 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30869df4-7634-4299-ac1b-cbd8660110fc-proxy-tls\") on node \"ip-10-0-133-158.ec2.internal\" DevicePath \"\""
Apr 21 15:46:57.591899 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:57.591864 2569 generic.go:358] "Generic (PLEG): container finished" podID="30869df4-7634-4299-ac1b-cbd8660110fc" containerID="1a41eb2e33adad599c7a77251c3f2951a88c3d1e3fd45ac2105f4a676051c1fa" exitCode=0
Apr 21 15:46:57.592055 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:57.591903 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9" event={"ID":"30869df4-7634-4299-ac1b-cbd8660110fc","Type":"ContainerDied","Data":"1a41eb2e33adad599c7a77251c3f2951a88c3d1e3fd45ac2105f4a676051c1fa"}
Apr 21 15:46:57.592055 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:57.591924 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9" event={"ID":"30869df4-7634-4299-ac1b-cbd8660110fc","Type":"ContainerDied","Data":"39799f4490af8d509c10a359a6ce894ac58d3a68fd0b25788034d91de1cdce0d"}
Apr 21 15:46:57.592055 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:57.591925 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"
Apr 21 15:46:57.592055 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:57.591945 2569 scope.go:117] "RemoveContainer" containerID="1a41eb2e33adad599c7a77251c3f2951a88c3d1e3fd45ac2105f4a676051c1fa"
Apr 21 15:46:57.599420 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:57.599399 2569 scope.go:117] "RemoveContainer" containerID="1a41eb2e33adad599c7a77251c3f2951a88c3d1e3fd45ac2105f4a676051c1fa"
Apr 21 15:46:57.599658 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:46:57.599637 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a41eb2e33adad599c7a77251c3f2951a88c3d1e3fd45ac2105f4a676051c1fa\": container with ID starting with 1a41eb2e33adad599c7a77251c3f2951a88c3d1e3fd45ac2105f4a676051c1fa not found: ID does not exist" containerID="1a41eb2e33adad599c7a77251c3f2951a88c3d1e3fd45ac2105f4a676051c1fa"
Apr 21 15:46:57.599703 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:57.599667 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a41eb2e33adad599c7a77251c3f2951a88c3d1e3fd45ac2105f4a676051c1fa"} err="failed to get container status \"1a41eb2e33adad599c7a77251c3f2951a88c3d1e3fd45ac2105f4a676051c1fa\": rpc error: code = NotFound desc = could not find container \"1a41eb2e33adad599c7a77251c3f2951a88c3d1e3fd45ac2105f4a676051c1fa\": container with ID starting with 1a41eb2e33adad599c7a77251c3f2951a88c3d1e3fd45ac2105f4a676051c1fa not found: ID does not exist"
Apr 21 15:46:57.609435 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:57.609411 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"]
Apr 21 15:46:57.615372 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:57.615351 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-4bd9d-b569d849-26rp9"]
Apr 21 15:46:59.549440 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:46:59.549409 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30869df4-7634-4299-ac1b-cbd8660110fc" path="/var/lib/kubelet/pods/30869df4-7634-4299-ac1b-cbd8660110fc/volumes"
Apr 21 15:48:06.820196 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:06.820160 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"]
Apr 21 15:48:06.820720 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:06.820447 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30869df4-7634-4299-ac1b-cbd8660110fc" containerName="model-chainer-raw-4bd9d"
Apr 21 15:48:06.820720 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:06.820459 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="30869df4-7634-4299-ac1b-cbd8660110fc" containerName="model-chainer-raw-4bd9d"
Apr 21 15:48:06.820720 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:06.820512 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="30869df4-7634-4299-ac1b-cbd8660110fc" containerName="model-chainer-raw-4bd9d"
Apr 21 15:48:06.822294 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:06.822273 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"
Apr 21 15:48:06.824561 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:06.824537 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 21 15:48:06.824561 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:06.824550 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xqtsr\""
Apr 21 15:48:06.824697 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:06.824545 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-3991c-kube-rbac-proxy-sar-config\""
Apr 21 15:48:06.825290 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:06.825269 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-3991c-serving-cert\""
Apr 21 15:48:06.831554 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:06.831531 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"]
Apr 21 15:48:07.005230 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:07.005196 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3325cfb3-5281-40b4-a5c9-178e451799fc-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-3991c-bbfb6647c-5twzp\" (UID: \"3325cfb3-5281-40b4-a5c9-178e451799fc\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"
Apr 21 15:48:07.005385 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:07.005254 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3325cfb3-5281-40b4-a5c9-178e451799fc-proxy-tls\") pod \"model-chainer-raw-hpa-3991c-bbfb6647c-5twzp\" (UID: \"3325cfb3-5281-40b4-a5c9-178e451799fc\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"
Apr 21 15:48:07.106566 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:07.106475 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3325cfb3-5281-40b4-a5c9-178e451799fc-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-3991c-bbfb6647c-5twzp\" (UID: \"3325cfb3-5281-40b4-a5c9-178e451799fc\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"
Apr 21 15:48:07.106566 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:07.106528 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3325cfb3-5281-40b4-a5c9-178e451799fc-proxy-tls\") pod \"model-chainer-raw-hpa-3991c-bbfb6647c-5twzp\" (UID: \"3325cfb3-5281-40b4-a5c9-178e451799fc\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"
Apr 21 15:48:07.107114 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:07.107093 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3325cfb3-5281-40b4-a5c9-178e451799fc-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-3991c-bbfb6647c-5twzp\" (UID: \"3325cfb3-5281-40b4-a5c9-178e451799fc\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"
Apr 21 15:48:07.109016 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:07.108990 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3325cfb3-5281-40b4-a5c9-178e451799fc-proxy-tls\") pod \"model-chainer-raw-hpa-3991c-bbfb6647c-5twzp\" (UID: \"3325cfb3-5281-40b4-a5c9-178e451799fc\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"
Apr 21 15:48:07.132925 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:07.132895 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"
Apr 21 15:48:07.251946 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:07.251914 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"]
Apr 21 15:48:07.254805 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:48:07.254774 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3325cfb3_5281_40b4_a5c9_178e451799fc.slice/crio-95fb9ab38d095504f3a84bdb16aae2881e9a47a0b932bb05039ba8c0362fd10b WatchSource:0}: Error finding container 95fb9ab38d095504f3a84bdb16aae2881e9a47a0b932bb05039ba8c0362fd10b: Status 404 returned error can't find the container with id 95fb9ab38d095504f3a84bdb16aae2881e9a47a0b932bb05039ba8c0362fd10b
Apr 21 15:48:07.779907 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:07.779871 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp" event={"ID":"3325cfb3-5281-40b4-a5c9-178e451799fc","Type":"ContainerStarted","Data":"1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421"}
Apr 21 15:48:07.779907 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:07.779907 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp" event={"ID":"3325cfb3-5281-40b4-a5c9-178e451799fc","Type":"ContainerStarted","Data":"95fb9ab38d095504f3a84bdb16aae2881e9a47a0b932bb05039ba8c0362fd10b"}
Apr 21 15:48:07.780103 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:07.779963 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"
Apr 21 15:48:07.798242 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:07.798194 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp" podStartSLOduration=1.7981834060000002 podStartE2EDuration="1.798183406s" podCreationTimestamp="2026-04-21 15:48:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:48:07.79656383 +0000 UTC m=+764.899015451" watchObservedRunningTime="2026-04-21 15:48:07.798183406 +0000 UTC m=+764.900635027"
Apr 21 15:48:13.786878 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:13.786847 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"
Apr 21 15:48:16.892806 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:16.892771 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"]
Apr 21 15:48:16.893193 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:16.893005 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp" podUID="3325cfb3-5281-40b4-a5c9-178e451799fc" containerName="model-chainer-raw-hpa-3991c" containerID="cri-o://1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421" gracePeriod=30
Apr 21 15:48:18.786333 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:18.786297 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp" podUID="3325cfb3-5281-40b4-a5c9-178e451799fc" containerName="model-chainer-raw-hpa-3991c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:48:23.785581 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:23.785538 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp" podUID="3325cfb3-5281-40b4-a5c9-178e451799fc" containerName="model-chainer-raw-hpa-3991c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:48:28.786428 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:28.786386 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp" podUID="3325cfb3-5281-40b4-a5c9-178e451799fc" containerName="model-chainer-raw-hpa-3991c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:48:28.786829 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:28.786494 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"
Apr 21 15:48:33.785856 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:33.785814 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp" podUID="3325cfb3-5281-40b4-a5c9-178e451799fc" containerName="model-chainer-raw-hpa-3991c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:48:38.785621 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:38.785575 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp" podUID="3325cfb3-5281-40b4-a5c9-178e451799fc" containerName="model-chainer-raw-hpa-3991c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:48:43.786713 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:43.786630 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp" podUID="3325cfb3-5281-40b4-a5c9-178e451799fc" containerName="model-chainer-raw-hpa-3991c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:48:46.912883 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:48:46.912853 2569 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3325cfb3_5281_40b4_a5c9_178e451799fc.slice/crio-conmon-1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3325cfb3_5281_40b4_a5c9_178e451799fc.slice/crio-1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 15:48:46.913295 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:48:46.912890 2569 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3325cfb3_5281_40b4_a5c9_178e451799fc.slice/crio-conmon-1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3325cfb3_5281_40b4_a5c9_178e451799fc.slice/crio-1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 15:48:46.913295 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:48:46.912847 2569 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3325cfb3_5281_40b4_a5c9_178e451799fc.slice/crio-conmon-1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3325cfb3_5281_40b4_a5c9_178e451799fc.slice/crio-1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 15:48:47.042632 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.042604 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"
Apr 21 15:48:47.096313 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.096282 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3325cfb3-5281-40b4-a5c9-178e451799fc-openshift-service-ca-bundle\") pod \"3325cfb3-5281-40b4-a5c9-178e451799fc\" (UID: \"3325cfb3-5281-40b4-a5c9-178e451799fc\") "
Apr 21 15:48:47.096469 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.096329 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3325cfb3-5281-40b4-a5c9-178e451799fc-proxy-tls\") pod \"3325cfb3-5281-40b4-a5c9-178e451799fc\" (UID: \"3325cfb3-5281-40b4-a5c9-178e451799fc\") "
Apr 21 15:48:47.096699 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.096669 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3325cfb3-5281-40b4-a5c9-178e451799fc-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "3325cfb3-5281-40b4-a5c9-178e451799fc" (UID: "3325cfb3-5281-40b4-a5c9-178e451799fc"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:48:47.098404 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.098384 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3325cfb3-5281-40b4-a5c9-178e451799fc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3325cfb3-5281-40b4-a5c9-178e451799fc" (UID: "3325cfb3-5281-40b4-a5c9-178e451799fc"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:48:47.196951 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.196911 2569 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3325cfb3-5281-40b4-a5c9-178e451799fc-openshift-service-ca-bundle\") on node \"ip-10-0-133-158.ec2.internal\" DevicePath \"\""
Apr 21 15:48:47.196951 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.196951 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3325cfb3-5281-40b4-a5c9-178e451799fc-proxy-tls\") on node \"ip-10-0-133-158.ec2.internal\" DevicePath \"\""
Apr 21 15:48:47.882246 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.882164 2569 generic.go:358] "Generic (PLEG): container finished" podID="3325cfb3-5281-40b4-a5c9-178e451799fc" containerID="1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421" exitCode=0
Apr 21 15:48:47.882246 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.882233 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"
Apr 21 15:48:47.882246 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.882235 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp" event={"ID":"3325cfb3-5281-40b4-a5c9-178e451799fc","Type":"ContainerDied","Data":"1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421"}
Apr 21 15:48:47.882488 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.882268 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp" event={"ID":"3325cfb3-5281-40b4-a5c9-178e451799fc","Type":"ContainerDied","Data":"95fb9ab38d095504f3a84bdb16aae2881e9a47a0b932bb05039ba8c0362fd10b"}
Apr 21 15:48:47.882488 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.882288 2569 scope.go:117] "RemoveContainer" containerID="1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421"
Apr 21 15:48:47.890162 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.890143 2569 scope.go:117] "RemoveContainer" containerID="1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421"
Apr 21 15:48:47.890401 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:48:47.890383 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421\": container with ID starting with 1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421 not found: ID does not exist" containerID="1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421"
Apr 21 15:48:47.890460 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.890413 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421"} err="failed to get container status \"1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421\": rpc error: code = NotFound desc = could not find container \"1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421\": container with ID starting with 1e9542bdc4b5d1223f385e9d6bb10fad2da69fba0caf87f43f6a294d19688421 not found: ID does not exist"
Apr 21 15:48:47.899652 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.899629 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"]
Apr 21 15:48:47.909060 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:47.909040 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3991c-bbfb6647c-5twzp"]
Apr 21 15:48:49.545983 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:48:49.545938 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3325cfb3-5281-40b4-a5c9-178e451799fc" path="/var/lib/kubelet/pods/3325cfb3-5281-40b4-a5c9-178e451799fc/volumes"
Apr 21 15:57:03.245292 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.245258 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hklb5/must-gather-bdxk9"]
Apr 21 15:57:03.245753 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.245552 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3325cfb3-5281-40b4-a5c9-178e451799fc" containerName="model-chainer-raw-hpa-3991c"
Apr 21 15:57:03.245753 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.245562 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3325cfb3-5281-40b4-a5c9-178e451799fc" containerName="model-chainer-raw-hpa-3991c"
Apr 21 15:57:03.245753 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.245617 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3325cfb3-5281-40b4-a5c9-178e451799fc" containerName="model-chainer-raw-hpa-3991c"
Apr 21 15:57:03.248444 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.248425 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hklb5/must-gather-bdxk9"
Apr 21 15:57:03.250999 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.250974 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hklb5\"/\"kube-root-ca.crt\""
Apr 21 15:57:03.251149 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.250974 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hklb5\"/\"openshift-service-ca.crt\""
Apr 21 15:57:03.251812 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.251794 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-hklb5\"/\"default-dockercfg-xqks4\""
Apr 21 15:57:03.256027 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.256009 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hklb5/must-gather-bdxk9"]
Apr 21 15:57:03.276273 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.276235 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tnbn\" (UniqueName: \"kubernetes.io/projected/98486720-59b9-4ca3-87ea-4126495e7648-kube-api-access-5tnbn\") pod \"must-gather-bdxk9\" (UID: \"98486720-59b9-4ca3-87ea-4126495e7648\") " pod="openshift-must-gather-hklb5/must-gather-bdxk9"
Apr 21 15:57:03.276445 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.276299 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98486720-59b9-4ca3-87ea-4126495e7648-must-gather-output\") pod \"must-gather-bdxk9\" (UID: \"98486720-59b9-4ca3-87ea-4126495e7648\") " pod="openshift-must-gather-hklb5/must-gather-bdxk9"
Apr 21 15:57:03.377400 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.377362 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for
volume \"kube-api-access-5tnbn\" (UniqueName: \"kubernetes.io/projected/98486720-59b9-4ca3-87ea-4126495e7648-kube-api-access-5tnbn\") pod \"must-gather-bdxk9\" (UID: \"98486720-59b9-4ca3-87ea-4126495e7648\") " pod="openshift-must-gather-hklb5/must-gather-bdxk9" Apr 21 15:57:03.377566 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.377418 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98486720-59b9-4ca3-87ea-4126495e7648-must-gather-output\") pod \"must-gather-bdxk9\" (UID: \"98486720-59b9-4ca3-87ea-4126495e7648\") " pod="openshift-must-gather-hklb5/must-gather-bdxk9" Apr 21 15:57:03.377701 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.377685 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98486720-59b9-4ca3-87ea-4126495e7648-must-gather-output\") pod \"must-gather-bdxk9\" (UID: \"98486720-59b9-4ca3-87ea-4126495e7648\") " pod="openshift-must-gather-hklb5/must-gather-bdxk9" Apr 21 15:57:03.386662 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.386625 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tnbn\" (UniqueName: \"kubernetes.io/projected/98486720-59b9-4ca3-87ea-4126495e7648-kube-api-access-5tnbn\") pod \"must-gather-bdxk9\" (UID: \"98486720-59b9-4ca3-87ea-4126495e7648\") " pod="openshift-must-gather-hklb5/must-gather-bdxk9" Apr 21 15:57:03.558320 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.558228 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hklb5/must-gather-bdxk9" Apr 21 15:57:03.679538 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.679503 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hklb5/must-gather-bdxk9"] Apr 21 15:57:03.682563 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:57:03.682534 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98486720_59b9_4ca3_87ea_4126495e7648.slice/crio-e0c9c6a11cc08cf59f023e33019a453932cd19415a1f2ddcf34aba694fd19082 WatchSource:0}: Error finding container e0c9c6a11cc08cf59f023e33019a453932cd19415a1f2ddcf34aba694fd19082: Status 404 returned error can't find the container with id e0c9c6a11cc08cf59f023e33019a453932cd19415a1f2ddcf34aba694fd19082 Apr 21 15:57:03.684154 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:03.684119 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:57:04.191764 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:04.191729 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hklb5/must-gather-bdxk9" event={"ID":"98486720-59b9-4ca3-87ea-4126495e7648","Type":"ContainerStarted","Data":"e0c9c6a11cc08cf59f023e33019a453932cd19415a1f2ddcf34aba694fd19082"} Apr 21 15:57:10.211491 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:10.211451 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hklb5/must-gather-bdxk9" event={"ID":"98486720-59b9-4ca3-87ea-4126495e7648","Type":"ContainerStarted","Data":"c4b07a9dcf604f7ba1c45b7f585587d41973406b19a0ab37e8e103ba60393e60"} Apr 21 15:57:10.211491 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:10.211491 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hklb5/must-gather-bdxk9" 
event={"ID":"98486720-59b9-4ca3-87ea-4126495e7648","Type":"ContainerStarted","Data":"a11861696f90db2695117abbe0ed7baff5e7fd613864752ea61cf3a5f02496e5"} Apr 21 15:57:10.230336 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:10.230273 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hklb5/must-gather-bdxk9" podStartSLOduration=1.343039102 podStartE2EDuration="7.230253083s" podCreationTimestamp="2026-04-21 15:57:03 +0000 UTC" firstStartedPulling="2026-04-21 15:57:03.684259453 +0000 UTC m=+1300.786711052" lastFinishedPulling="2026-04-21 15:57:09.571473429 +0000 UTC m=+1306.673925033" observedRunningTime="2026-04-21 15:57:10.229693044 +0000 UTC m=+1307.332144678" watchObservedRunningTime="2026-04-21 15:57:10.230253083 +0000 UTC m=+1307.332704706" Apr 21 15:57:26.258470 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:26.258389 2569 generic.go:358] "Generic (PLEG): container finished" podID="98486720-59b9-4ca3-87ea-4126495e7648" containerID="a11861696f90db2695117abbe0ed7baff5e7fd613864752ea61cf3a5f02496e5" exitCode=0 Apr 21 15:57:26.258849 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:26.258464 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hklb5/must-gather-bdxk9" event={"ID":"98486720-59b9-4ca3-87ea-4126495e7648","Type":"ContainerDied","Data":"a11861696f90db2695117abbe0ed7baff5e7fd613864752ea61cf3a5f02496e5"} Apr 21 15:57:26.258849 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:26.258786 2569 scope.go:117] "RemoveContainer" containerID="a11861696f90db2695117abbe0ed7baff5e7fd613864752ea61cf3a5f02496e5" Apr 21 15:57:27.003668 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:27.003630 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hklb5_must-gather-bdxk9_98486720-59b9-4ca3-87ea-4126495e7648/gather/0.log" Apr 21 15:57:30.464026 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:30.463997 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-2p6kz_08cd6fc2-7059-4977-908e-bd91ccd03110/global-pull-secret-syncer/0.log" Apr 21 15:57:30.735086 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:30.735008 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zptw6_6423248e-635c-46da-980c-a175f07c835b/konnectivity-agent/0.log" Apr 21 15:57:30.760235 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:30.760203 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-158.ec2.internal_35017106fd99d25a848bf349caf9d842/haproxy/0.log" Apr 21 15:57:32.412460 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:32.412416 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hklb5/must-gather-bdxk9"] Apr 21 15:57:32.414828 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:32.412715 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-hklb5/must-gather-bdxk9" podUID="98486720-59b9-4ca3-87ea-4126495e7648" containerName="copy" containerID="cri-o://c4b07a9dcf604f7ba1c45b7f585587d41973406b19a0ab37e8e103ba60393e60" gracePeriod=2 Apr 21 15:57:32.419309 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:32.419264 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hklb5/must-gather-bdxk9"] Apr 21 15:57:32.639269 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:32.639243 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hklb5_must-gather-bdxk9_98486720-59b9-4ca3-87ea-4126495e7648/copy/0.log" Apr 21 15:57:32.639567 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:32.639551 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hklb5/must-gather-bdxk9" Apr 21 15:57:32.642040 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:32.642016 2569 status_manager.go:895] "Failed to get status for pod" podUID="98486720-59b9-4ca3-87ea-4126495e7648" pod="openshift-must-gather-hklb5/must-gather-bdxk9" err="pods \"must-gather-bdxk9\" is forbidden: User \"system:node:ip-10-0-133-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hklb5\": no relationship found between node 'ip-10-0-133-158.ec2.internal' and this object" Apr 21 15:57:32.724340 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:32.724295 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tnbn\" (UniqueName: \"kubernetes.io/projected/98486720-59b9-4ca3-87ea-4126495e7648-kube-api-access-5tnbn\") pod \"98486720-59b9-4ca3-87ea-4126495e7648\" (UID: \"98486720-59b9-4ca3-87ea-4126495e7648\") " Apr 21 15:57:32.724495 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:32.724455 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98486720-59b9-4ca3-87ea-4126495e7648-must-gather-output\") pod \"98486720-59b9-4ca3-87ea-4126495e7648\" (UID: \"98486720-59b9-4ca3-87ea-4126495e7648\") " Apr 21 15:57:32.725726 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:32.725693 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98486720-59b9-4ca3-87ea-4126495e7648-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "98486720-59b9-4ca3-87ea-4126495e7648" (UID: "98486720-59b9-4ca3-87ea-4126495e7648"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:57:32.726385 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:32.726361 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98486720-59b9-4ca3-87ea-4126495e7648-kube-api-access-5tnbn" (OuterVolumeSpecName: "kube-api-access-5tnbn") pod "98486720-59b9-4ca3-87ea-4126495e7648" (UID: "98486720-59b9-4ca3-87ea-4126495e7648"). InnerVolumeSpecName "kube-api-access-5tnbn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:57:32.825361 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:32.825329 2569 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98486720-59b9-4ca3-87ea-4126495e7648-must-gather-output\") on node \"ip-10-0-133-158.ec2.internal\" DevicePath \"\"" Apr 21 15:57:32.825361 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:32.825357 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5tnbn\" (UniqueName: \"kubernetes.io/projected/98486720-59b9-4ca3-87ea-4126495e7648-kube-api-access-5tnbn\") on node \"ip-10-0-133-158.ec2.internal\" DevicePath \"\"" Apr 21 15:57:33.281820 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:33.281789 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hklb5_must-gather-bdxk9_98486720-59b9-4ca3-87ea-4126495e7648/copy/0.log" Apr 21 15:57:33.282148 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:33.282106 2569 generic.go:358] "Generic (PLEG): container finished" podID="98486720-59b9-4ca3-87ea-4126495e7648" containerID="c4b07a9dcf604f7ba1c45b7f585587d41973406b19a0ab37e8e103ba60393e60" exitCode=143 Apr 21 15:57:33.282230 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:33.282178 2569 scope.go:117] "RemoveContainer" containerID="c4b07a9dcf604f7ba1c45b7f585587d41973406b19a0ab37e8e103ba60393e60" Apr 21 15:57:33.282230 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:33.282188 2569 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hklb5/must-gather-bdxk9" Apr 21 15:57:33.284971 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:33.284942 2569 status_manager.go:895] "Failed to get status for pod" podUID="98486720-59b9-4ca3-87ea-4126495e7648" pod="openshift-must-gather-hklb5/must-gather-bdxk9" err="pods \"must-gather-bdxk9\" is forbidden: User \"system:node:ip-10-0-133-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hklb5\": no relationship found between node 'ip-10-0-133-158.ec2.internal' and this object" Apr 21 15:57:33.289294 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:33.289270 2569 scope.go:117] "RemoveContainer" containerID="a11861696f90db2695117abbe0ed7baff5e7fd613864752ea61cf3a5f02496e5" Apr 21 15:57:33.292866 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:33.292839 2569 status_manager.go:895] "Failed to get status for pod" podUID="98486720-59b9-4ca3-87ea-4126495e7648" pod="openshift-must-gather-hklb5/must-gather-bdxk9" err="pods \"must-gather-bdxk9\" is forbidden: User \"system:node:ip-10-0-133-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hklb5\": no relationship found between node 'ip-10-0-133-158.ec2.internal' and this object" Apr 21 15:57:33.301007 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:33.300991 2569 scope.go:117] "RemoveContainer" containerID="c4b07a9dcf604f7ba1c45b7f585587d41973406b19a0ab37e8e103ba60393e60" Apr 21 15:57:33.301272 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:57:33.301253 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4b07a9dcf604f7ba1c45b7f585587d41973406b19a0ab37e8e103ba60393e60\": container with ID starting with c4b07a9dcf604f7ba1c45b7f585587d41973406b19a0ab37e8e103ba60393e60 not found: ID does not exist" 
containerID="c4b07a9dcf604f7ba1c45b7f585587d41973406b19a0ab37e8e103ba60393e60" Apr 21 15:57:33.301320 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:33.301281 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4b07a9dcf604f7ba1c45b7f585587d41973406b19a0ab37e8e103ba60393e60"} err="failed to get container status \"c4b07a9dcf604f7ba1c45b7f585587d41973406b19a0ab37e8e103ba60393e60\": rpc error: code = NotFound desc = could not find container \"c4b07a9dcf604f7ba1c45b7f585587d41973406b19a0ab37e8e103ba60393e60\": container with ID starting with c4b07a9dcf604f7ba1c45b7f585587d41973406b19a0ab37e8e103ba60393e60 not found: ID does not exist" Apr 21 15:57:33.301320 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:33.301298 2569 scope.go:117] "RemoveContainer" containerID="a11861696f90db2695117abbe0ed7baff5e7fd613864752ea61cf3a5f02496e5" Apr 21 15:57:33.301542 ip-10-0-133-158 kubenswrapper[2569]: E0421 15:57:33.301524 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a11861696f90db2695117abbe0ed7baff5e7fd613864752ea61cf3a5f02496e5\": container with ID starting with a11861696f90db2695117abbe0ed7baff5e7fd613864752ea61cf3a5f02496e5 not found: ID does not exist" containerID="a11861696f90db2695117abbe0ed7baff5e7fd613864752ea61cf3a5f02496e5" Apr 21 15:57:33.301584 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:33.301549 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a11861696f90db2695117abbe0ed7baff5e7fd613864752ea61cf3a5f02496e5"} err="failed to get container status \"a11861696f90db2695117abbe0ed7baff5e7fd613864752ea61cf3a5f02496e5\": rpc error: code = NotFound desc = could not find container \"a11861696f90db2695117abbe0ed7baff5e7fd613864752ea61cf3a5f02496e5\": container with ID starting with a11861696f90db2695117abbe0ed7baff5e7fd613864752ea61cf3a5f02496e5 not found: ID does not exist" Apr 21 
15:57:33.546076 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:33.546001 2569 status_manager.go:895] "Failed to get status for pod" podUID="98486720-59b9-4ca3-87ea-4126495e7648" pod="openshift-must-gather-hklb5/must-gather-bdxk9" err="pods \"must-gather-bdxk9\" is forbidden: User \"system:node:ip-10-0-133-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hklb5\": no relationship found between node 'ip-10-0-133-158.ec2.internal' and this object" Apr 21 15:57:33.546076 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:33.546016 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98486720-59b9-4ca3-87ea-4126495e7648" path="/var/lib/kubelet/pods/98486720-59b9-4ca3-87ea-4126495e7648/volumes" Apr 21 15:57:34.101330 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:34.101302 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-8b68b5dff-cfnl7_23a4ea67-b844-43a1-bd74-fb2d6787d688/metrics-server/0.log" Apr 21 15:57:34.141023 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:34.140988 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-wtsst_0f967e80-450d-4360-9a92-a5407235a3a9/monitoring-plugin/0.log" Apr 21 15:57:34.203944 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:34.203916 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pjqcl_ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58/node-exporter/0.log" Apr 21 15:57:34.242937 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:34.242908 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pjqcl_ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58/kube-rbac-proxy/0.log" Apr 21 15:57:34.303471 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:34.303447 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-pjqcl_ed1e9dc2-5dbd-4b3a-ab08-e2f917d2bd58/init-textfile/0.log" Apr 21 15:57:34.724442 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:34.724400 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-sgdpn_2bc24c10-a6e7-48e8-a178-dbc7f52c7d59/kube-rbac-proxy-main/0.log" Apr 21 15:57:34.794757 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:34.794723 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-sgdpn_2bc24c10-a6e7-48e8-a178-dbc7f52c7d59/kube-rbac-proxy-self/0.log" Apr 21 15:57:34.877645 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:34.877618 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-sgdpn_2bc24c10-a6e7-48e8-a178-dbc7f52c7d59/openshift-state-metrics/0.log" Apr 21 15:57:36.953665 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:36.953625 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"] Apr 21 15:57:36.954164 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:36.954037 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98486720-59b9-4ca3-87ea-4126495e7648" containerName="gather" Apr 21 15:57:36.954164 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:36.954057 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="98486720-59b9-4ca3-87ea-4126495e7648" containerName="gather" Apr 21 15:57:36.954164 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:36.954086 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98486720-59b9-4ca3-87ea-4126495e7648" containerName="copy" Apr 21 15:57:36.954164 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:36.954094 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="98486720-59b9-4ca3-87ea-4126495e7648" containerName="copy" 
Apr 21 15:57:36.954378 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:36.954182 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="98486720-59b9-4ca3-87ea-4126495e7648" containerName="gather"
Apr 21 15:57:36.954378 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:36.954193 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="98486720-59b9-4ca3-87ea-4126495e7648" containerName="copy"
Apr 21 15:57:36.959375 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:36.959351 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:36.961634 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:36.961611 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7pvkx\"/\"kube-root-ca.crt\""
Apr 21 15:57:36.962506 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:36.962489 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7pvkx\"/\"default-dockercfg-mgs4b\""
Apr 21 15:57:36.962555 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:36.962489 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7pvkx\"/\"openshift-service-ca.crt\""
Apr 21 15:57:36.970413 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:36.970392 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"]
Apr 21 15:57:37.052745 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.052712 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95f4f634-8067-400d-ad58-b207e60b80b4-sys\") pod \"perf-node-gather-daemonset-rc5gw\" (UID: \"95f4f634-8067-400d-ad58-b207e60b80b4\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.052868 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.052783 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/95f4f634-8067-400d-ad58-b207e60b80b4-podres\") pod \"perf-node-gather-daemonset-rc5gw\" (UID: \"95f4f634-8067-400d-ad58-b207e60b80b4\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.052868 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.052830 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8ttv\" (UniqueName: \"kubernetes.io/projected/95f4f634-8067-400d-ad58-b207e60b80b4-kube-api-access-t8ttv\") pod \"perf-node-gather-daemonset-rc5gw\" (UID: \"95f4f634-8067-400d-ad58-b207e60b80b4\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.052868 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.052852 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/95f4f634-8067-400d-ad58-b207e60b80b4-proc\") pod \"perf-node-gather-daemonset-rc5gw\" (UID: \"95f4f634-8067-400d-ad58-b207e60b80b4\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.052992 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.052901 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95f4f634-8067-400d-ad58-b207e60b80b4-lib-modules\") pod \"perf-node-gather-daemonset-rc5gw\" (UID: \"95f4f634-8067-400d-ad58-b207e60b80b4\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.153951 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.153919 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/95f4f634-8067-400d-ad58-b207e60b80b4-podres\") pod \"perf-node-gather-daemonset-rc5gw\" (UID: \"95f4f634-8067-400d-ad58-b207e60b80b4\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.154073 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.153966 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8ttv\" (UniqueName: \"kubernetes.io/projected/95f4f634-8067-400d-ad58-b207e60b80b4-kube-api-access-t8ttv\") pod \"perf-node-gather-daemonset-rc5gw\" (UID: \"95f4f634-8067-400d-ad58-b207e60b80b4\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.154113 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.154082 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/95f4f634-8067-400d-ad58-b207e60b80b4-proc\") pod \"perf-node-gather-daemonset-rc5gw\" (UID: \"95f4f634-8067-400d-ad58-b207e60b80b4\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.154177 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.154112 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/95f4f634-8067-400d-ad58-b207e60b80b4-podres\") pod \"perf-node-gather-daemonset-rc5gw\" (UID: \"95f4f634-8067-400d-ad58-b207e60b80b4\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.154177 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.154118 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95f4f634-8067-400d-ad58-b207e60b80b4-lib-modules\") pod \"perf-node-gather-daemonset-rc5gw\" (UID: \"95f4f634-8067-400d-ad58-b207e60b80b4\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.154248 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.154198 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/95f4f634-8067-400d-ad58-b207e60b80b4-proc\") pod \"perf-node-gather-daemonset-rc5gw\" (UID: \"95f4f634-8067-400d-ad58-b207e60b80b4\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.154248 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.154205 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95f4f634-8067-400d-ad58-b207e60b80b4-sys\") pod \"perf-node-gather-daemonset-rc5gw\" (UID: \"95f4f634-8067-400d-ad58-b207e60b80b4\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.154248 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.154238 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95f4f634-8067-400d-ad58-b207e60b80b4-lib-modules\") pod \"perf-node-gather-daemonset-rc5gw\" (UID: \"95f4f634-8067-400d-ad58-b207e60b80b4\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.154337 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.154274 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95f4f634-8067-400d-ad58-b207e60b80b4-sys\") pod \"perf-node-gather-daemonset-rc5gw\" (UID: \"95f4f634-8067-400d-ad58-b207e60b80b4\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.162687 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.162655 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8ttv\" (UniqueName: \"kubernetes.io/projected/95f4f634-8067-400d-ad58-b207e60b80b4-kube-api-access-t8ttv\") pod \"perf-node-gather-daemonset-rc5gw\" (UID: \"95f4f634-8067-400d-ad58-b207e60b80b4\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.268356 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.268259 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:37.385899 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:37.385869 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"]
Apr 21 15:57:37.389199 ip-10-0-133-158 kubenswrapper[2569]: W0421 15:57:37.389170 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod95f4f634_8067_400d_ad58_b207e60b80b4.slice/crio-d646a42b0deeada03e8ba42b16e0b6d6c7dd76af2ab94cc89055992ff91da966 WatchSource:0}: Error finding container d646a42b0deeada03e8ba42b16e0b6d6c7dd76af2ab94cc89055992ff91da966: Status 404 returned error can't find the container with id d646a42b0deeada03e8ba42b16e0b6d6c7dd76af2ab94cc89055992ff91da966
Apr 21 15:57:38.225237 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:38.225204 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-hdmgw_f12a02e9-41ef-4e4b-913a-8246dfcb282b/download-server/0.log"
Apr 21 15:57:38.298817 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:38.298781 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw" event={"ID":"95f4f634-8067-400d-ad58-b207e60b80b4","Type":"ContainerStarted","Data":"1c2c179143300283eb131bd954980d4cde18029c419900612a91944f59266ad6"}
Apr 21 15:57:38.298817 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:38.298824 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw" event={"ID":"95f4f634-8067-400d-ad58-b207e60b80b4","Type":"ContainerStarted","Data":"d646a42b0deeada03e8ba42b16e0b6d6c7dd76af2ab94cc89055992ff91da966"}
Apr 21 15:57:38.299051 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:38.298915 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:39.852468 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:39.852436 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sg5b7_60d46684-622f-4c21-be81-9f138a88507d/dns/0.log"
Apr 21 15:57:39.907803 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:39.907766 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sg5b7_60d46684-622f-4c21-be81-9f138a88507d/kube-rbac-proxy/0.log"
Apr 21 15:57:39.984603 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:39.984572 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-f7s8m_5199de06-2148-4da2-80d1-c514e73d93fb/dns-node-resolver/0.log"
Apr 21 15:57:40.619039 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:40.619005 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-789665c8f5-2khmb_94ced3d3-36cb-4c6f-8871-7381aaf033c2/registry/0.log"
Apr 21 15:57:40.661892 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:40.661863 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-94bl8_0187917a-3b91-4173-9632-8211e2adc77e/node-ca/0.log"
Apr 21 15:57:41.647690 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:41.647664 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-58d658548d-g2475_746f5baf-fbf8-4ff6-8fb9-0a3263bd7dc6/router/0.log"
Apr 21 15:57:42.119776 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:42.119701 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rhcxk_f1a9389d-bece-4871-8a89-c3af3238f617/serve-healthcheck-canary/0.log"
Apr 21 15:57:42.715276 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:42.715249 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jscgp_a429572e-f646-4624-aaee-489752ccaffb/kube-rbac-proxy/0.log"
Apr 21 15:57:42.746319 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:42.746297 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jscgp_a429572e-f646-4624-aaee-489752ccaffb/exporter/0.log"
Apr 21 15:57:42.773832 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:42.773809 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jscgp_a429572e-f646-4624-aaee-489752ccaffb/extractor/0.log"
Apr 21 15:57:44.311170 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:44.311116 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw"
Apr 21 15:57:44.334932 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:44.334874 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rc5gw" podStartSLOduration=8.334860064 podStartE2EDuration="8.334860064s" podCreationTimestamp="2026-04-21 15:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:57:38.32971584 +0000 UTC m=+1335.432167463" watchObservedRunningTime="2026-04-21 15:57:44.334860064 +0000 UTC m=+1341.437311686"
Apr 21 15:57:44.896023 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:44.895990 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-9c85dd4d8-5f58h_3ae42f61-7b5a-4286-a54e-03b6f3fb89f3/manager/0.log"
Apr 21 15:57:44.944929 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:44.944893 2569 log.go:25] "Finished parsing log file"
path="/var/log/pods/kserve_llmisvc-controller-manager-6b94ff949c-vtgsb_797464d7-ee23-4c4d-908d-ba863a1e2e8f/manager/0.log" Apr 21 15:57:51.489058 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:51.489023 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-84m4w_c11e8a8b-9ef5-4ed8-a05e-9e1d9ad466d2/kube-multus/0.log" Apr 21 15:57:52.070026 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:52.069996 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll86t_8e625158-0ce1-4766-9988-80be7fb8ed12/kube-multus-additional-cni-plugins/0.log" Apr 21 15:57:52.101478 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:52.101446 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll86t_8e625158-0ce1-4766-9988-80be7fb8ed12/egress-router-binary-copy/0.log" Apr 21 15:57:52.127151 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:52.127111 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll86t_8e625158-0ce1-4766-9988-80be7fb8ed12/cni-plugins/0.log" Apr 21 15:57:52.152351 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:52.152328 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll86t_8e625158-0ce1-4766-9988-80be7fb8ed12/bond-cni-plugin/0.log" Apr 21 15:57:52.176276 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:52.176251 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll86t_8e625158-0ce1-4766-9988-80be7fb8ed12/routeoverride-cni/0.log" Apr 21 15:57:52.203532 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:52.203506 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll86t_8e625158-0ce1-4766-9988-80be7fb8ed12/whereabouts-cni-bincopy/0.log" Apr 21 15:57:52.230349 ip-10-0-133-158 
kubenswrapper[2569]: I0421 15:57:52.230325 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll86t_8e625158-0ce1-4766-9988-80be7fb8ed12/whereabouts-cni/0.log" Apr 21 15:57:52.291743 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:52.291691 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8tknf_0e013988-1283-4f21-89bb-0200deb14502/network-metrics-daemon/0.log" Apr 21 15:57:52.323996 ip-10-0-133-158 kubenswrapper[2569]: I0421 15:57:52.323916 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8tknf_0e013988-1283-4f21-89bb-0200deb14502/kube-rbac-proxy/0.log"