Apr 17 11:13:55.152870 ip-10-0-136-177 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 11:13:55.152883 ip-10-0-136-177 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 11:13:55.152893 ip-10-0-136-177 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 11:13:55.153229 ip-10-0-136-177 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 11:14:05.326013 ip-10-0-136-177 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 11:14:05.326026 ip-10-0-136-177 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b1398ba580d5495da75b808c0f955580 --
Apr 17 11:16:30.924161 ip-10-0-136-177 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 11:16:31.374200 ip-10-0-136-177 kubenswrapper[2581]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:31.374200 ip-10-0-136-177 kubenswrapper[2581]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 11:16:31.374200 ip-10-0-136-177 kubenswrapper[2581]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:31.374200 ip-10-0-136-177 kubenswrapper[2581]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 11:16:31.374200 ip-10-0-136-177 kubenswrapper[2581]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:31.374200 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.373960 2581 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380052 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380067 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380071 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380074 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380079 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380082 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380085 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380088 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380091 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380094 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380097 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380100 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380103 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380105 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380108 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380111 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380113 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380116 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380118 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:31.507193 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380121 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:31.482543 ip-10-0-136-177 systemd[1]: Started Kubernetes Kubelet.
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380123 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380126 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380129 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380131 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380133 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380138 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380142 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380145 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380147 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380150 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380160 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380163 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380166 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380169 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380173 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380177 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380180 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380183 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:31.509301 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380186 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380189 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380191 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380194 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380196 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380199 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380201 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380205 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380207 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380210 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380213 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380215 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380218 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380221 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380223 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380226 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380229 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380231 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380234 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380236 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:31.510176 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380239 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380241 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380244 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380247 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380251 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380253 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380256 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380259 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380262 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380264 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380267 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380269 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380272 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380275 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380277 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380280 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380282 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380285 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380288 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380290 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:31.691662 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380294 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380297 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380299 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380302 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380305 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380307 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380310 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380313 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380715 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380720 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380723 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380726 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380729 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380731 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380734 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380736 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380739 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380742 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380745 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380749 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:31.705667 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380751 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380754 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380757 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380760 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380762 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380766 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380768 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380771 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380774 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380776 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380779 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380782 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380785 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380789 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380791 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380794 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380796 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380799 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380801 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380804 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:31.706467 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380806 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380809 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380812 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380814 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380836 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380840 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380844 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380848 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380852 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380855 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380858 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380861 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380864 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380867 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380869 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380872 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380881 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380886 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380889 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380892 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:31.707222 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380895 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380898 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380900 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380903 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380906 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380909 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380912 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380915 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380918 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380921 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380923 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380926 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380928 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380931 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380933 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380936 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380938 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380941 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380943 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380946 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:31.707838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380948 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380951 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380953 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380956 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380958 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380961 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380964 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380966 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380969 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380971 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380974 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380978 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380980 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.380983 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381059 2581 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381066 2581 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381073 2581 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381078 2581 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381083 2581 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381087 2581 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381096 2581 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 11:16:31.708493 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381100 2581 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381104 2581 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381107 2581 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381110 2581 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381113 2581 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381116 2581 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381119 2581 flags.go:64] FLAG: --cgroup-root=""
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381122 2581 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381125 2581 flags.go:64] FLAG: --client-ca-file=""
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381128 2581 flags.go:64] FLAG: --cloud-config=""
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381131 2581 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381134 2581 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381138 2581 flags.go:64] FLAG: --cluster-domain=""
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381141 2581 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381144 2581 flags.go:64] FLAG: --config-dir=""
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381147 2581 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381150 2581 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381154 2581 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381157 2581 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381160 2581 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381164 2581 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381167 2581 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381170 2581 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381173 2581 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381176 2581 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 11:16:31.709171 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381179 2581 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381183 2581 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381186 2581 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381189 2581 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381191 2581 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381196 2581 flags.go:64] FLAG: --enable-server="true"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381199 2581 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381204 2581 flags.go:64] FLAG: --event-burst="100"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381207 2581 flags.go:64] FLAG: --event-qps="50"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381210 2581 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381213 2581 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381216 2581 flags.go:64] FLAG: --eviction-hard=""
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381220 2581 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381223 2581 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381226 2581 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381229 2581 flags.go:64] FLAG: --eviction-soft=""
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381232 2581 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381235 2581 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381238 2581 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381241 2581 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381244 2581 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381247 2581 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381252 2581 flags.go:64] FLAG: --feature-gates=""
Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381256
2581 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381259 2581 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 11:16:31.709910 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381262 2581 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381266 2581 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381269 2581 flags.go:64] FLAG: --healthz-port="10248" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381272 2581 flags.go:64] FLAG: --help="false" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381274 2581 flags.go:64] FLAG: --hostname-override="ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381278 2581 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381285 2581 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381288 2581 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381291 2581 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381294 2581 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381297 2581 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381300 2581 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 11:16:31.710769 
ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381303 2581 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381306 2581 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381309 2581 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381312 2581 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381316 2581 flags.go:64] FLAG: --kube-reserved="" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381319 2581 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381321 2581 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381324 2581 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381328 2581 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381331 2581 flags.go:64] FLAG: --lock-file="" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381333 2581 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381336 2581 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 11:16:31.710769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381339 2581 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381344 2581 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381348 2581 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381350 2581 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381353 2581 flags.go:64] FLAG: --logging-format="text" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381357 2581 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381360 2581 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381363 2581 flags.go:64] FLAG: --manifest-url="" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381366 2581 flags.go:64] FLAG: --manifest-url-header="" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381371 2581 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381374 2581 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381378 2581 flags.go:64] FLAG: --max-pods="110" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381381 2581 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381384 2581 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381389 2581 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381392 2581 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381395 2581 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 11:16:31.711533 
ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381398 2581 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381401 2581 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381408 2581 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381411 2581 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381414 2581 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381418 2581 flags.go:64] FLAG: --pod-cidr="" Apr 17 11:16:31.711533 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381422 2581 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381427 2581 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381430 2581 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381433 2581 flags.go:64] FLAG: --pods-per-core="0" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381436 2581 flags.go:64] FLAG: --port="10250" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381439 2581 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381442 2581 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d1449bb73031d21b" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381446 2581 flags.go:64] FLAG: --qos-reserved="" Apr 
17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381449 2581 flags.go:64] FLAG: --read-only-port="10255" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381452 2581 flags.go:64] FLAG: --register-node="true" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381455 2581 flags.go:64] FLAG: --register-schedulable="true" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381458 2581 flags.go:64] FLAG: --register-with-taints="" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381461 2581 flags.go:64] FLAG: --registry-burst="10" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381464 2581 flags.go:64] FLAG: --registry-qps="5" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381468 2581 flags.go:64] FLAG: --reserved-cpus="" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381471 2581 flags.go:64] FLAG: --reserved-memory="" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381475 2581 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381478 2581 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381481 2581 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381484 2581 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381487 2581 flags.go:64] FLAG: --runonce="false" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381490 2581 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381493 2581 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381496 2581 flags.go:64] FLAG: --seccomp-default="false" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381500 2581 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381504 2581 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 11:16:31.712159 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381507 2581 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381510 2581 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381513 2581 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381516 2581 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381519 2581 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381522 2581 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381525 2581 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381529 2581 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381532 2581 flags.go:64] FLAG: --system-cgroups="" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381534 2581 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381540 2581 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 11:16:31.712779 ip-10-0-136-177 
kubenswrapper[2581]: I0417 11:16:31.381542 2581 flags.go:64] FLAG: --tls-cert-file="" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381545 2581 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381549 2581 flags.go:64] FLAG: --tls-min-version="" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381552 2581 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381555 2581 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381558 2581 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381561 2581 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381563 2581 flags.go:64] FLAG: --v="2" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381568 2581 flags.go:64] FLAG: --version="false" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381573 2581 flags.go:64] FLAG: --vmodule="" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381582 2581 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.381585 2581 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381678 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:16:31.712779 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381681 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381684 2581 feature_gate.go:328] 
unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381687 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381690 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381693 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381695 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381699 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381702 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381705 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381707 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381710 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381712 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381714 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381717 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 11:16:31.713405 ip-10-0-136-177 
kubenswrapper[2581]: W0417 11:16:31.381720 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381723 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381725 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381728 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381730 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381733 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:16:31.713405 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381735 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381738 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381742 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381745 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381748 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381751 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381753 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381756 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381759 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381762 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381776 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381780 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381784 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381787 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381790 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:16:31.713944 ip-10-0-136-177 
kubenswrapper[2581]: W0417 11:16:31.381793 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381795 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381798 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381804 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:16:31.713944 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381806 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381809 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381811 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381831 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381836 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381838 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381841 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381844 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381846 2581 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381857 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381860 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381862 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381865 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381868 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381870 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381873 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381875 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381877 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381880 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381883 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381885 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 11:16:31.714410 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381889 2581 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAzure Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381892 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381895 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381897 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381900 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381902 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381905 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381907 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381910 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381912 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381916 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381919 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381921 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 
11:16:31.381924 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381927 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381929 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381932 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381934 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381937 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381939 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:16:31.714940 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381942 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381945 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381949 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381952 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.381955 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.382803 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.389721 2581 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.389740 2581 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389789 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389794 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389797 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389800 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389803 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389806 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389809 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389812 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:31.715416 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389814 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389835 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389840 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389844 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389853 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389856 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389859 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389861 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389864 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389867 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389869 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389872 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389875 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389877 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389881 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389885 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389888 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389892 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389895 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:31.715847 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389898 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389901 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389904 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389906 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389909 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389912 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389914 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389917 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389919 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389922 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389925 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389927 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389930 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389933 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389935 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389939 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389943 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389945 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389948 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389951 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:31.716324 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389953 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389956 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389958 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389961 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389963 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389965 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389968 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389971 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389974 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389977 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389979 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389983 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389986 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389988 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389991 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389994 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389996 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.389999 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390001 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390004 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:31.716845 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390007 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390009 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390012 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390014 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390017 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390019 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390022 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390024 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390027 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390029 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390032 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390035 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390037 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390040 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390042 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390045 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390048 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390055 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:31.717336 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390057 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:31.717775 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.390063 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:31.717775 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390159 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:31.717775 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390163 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:31.717775 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390167 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:31.717775 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390169 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:31.717775 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390173 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:31.717775 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390175 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:31.717775 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390179 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:31.717775 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390182 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:31.717775 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390184 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:31.717775 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390187 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:31.717775 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390189 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:31.717775 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390192 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:31.717775 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390194 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:31.717775 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390197 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390200 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390202 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390205 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390207 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390210 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390212 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390215 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390218 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390220 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390223 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390226 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390228 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390231 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390234 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390237 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390239 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390243 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390247 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390250 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:31.718225 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390252 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390255 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390257 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390261 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390266 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390268 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390271 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390273 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390276 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390278 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390281 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390283 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390286 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390288 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390291 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390293 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390296 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390299 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390301 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:31.718717 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390304 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390306 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390310 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390314 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390316 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390319 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390322 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390324 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390327 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390329 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390333 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390337 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390340 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390342 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390345 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390347 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390350 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390352 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390355 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:31.719178 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390358 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390360 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390363 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390366 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390368 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390371 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390374 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390376 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390379 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390381 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390384 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390386 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390389 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390392 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:31.390394 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.390399 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:31.719642 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.391182 2581 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 11:16:31.720058 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.399017 2581 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 11:16:31.720058 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.399907 2581 server.go:1019] "Starting client certificate rotation"
Apr 17 11:16:31.720058 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.400017 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:31.720058 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.400057 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:31.720058 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.423219 2581 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:31.720058 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.426306 2581 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:31.720058 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.440219 2581 log.go:25] "Validated CRI v1 runtime API"
Apr 17 11:16:31.720058 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.446464 2581 log.go:25] "Validated CRI v1 image API"
Apr 17 11:16:31.720058 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.449079 2581 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 11:16:31.720058 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.455552 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:16:31.720058 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.456765 2581 fs.go:135] Filesystem UUIDs: map[4ba73975-5916-4ce2-9fb6-18b1921247be:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 ad8d6083-0ae9-48dd-8506-18c98f908ce7:/dev/nvme0n1p4]
Apr 17 11:16:31.720058 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.456790 2581 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.462203 2581 manager.go:217] Machine: {Timestamp:2026-04-17 11:16:31.460969686 +0000 UTC m=+0.403123918 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101554 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2572e495214cf99b2fa57677c2c372 SystemUUID:ec2572e4-9521-4cf9-9b2f-a57677c2c372 BootID:b1398ba5-80d5-495d-a75b-808c0f955580 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ef:18:88:f5:f7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ef:18:88:f5:f7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:12:94:99:bf:d8:df Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.462307 2581 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.462391 2581 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.463477 2581 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.463496 2581 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-177.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPoli
cyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.463649 2581 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.463658 2581 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.463671 2581 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.463686 2581 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.465304 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.465410 2581 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.468086 2581 kubelet.go:491] "Attempting to sync node with API server" Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.468097 2581 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.468799 2581 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.468810 2581 kubelet.go:397] "Adding apiserver pod source" Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.468830 2581 
apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.469661 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 11:16:31.720372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.469677 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.472872 2581 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.474686 2581 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.476518 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.476533 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.476539 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.476545 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.476550 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.476556 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.476565 2581 plugins.go:616] "Loaded 
volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.476573 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.476584 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.476593 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.476606 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.476621 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.477506 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.477518 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.481686 2581 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.481726 2581 server.go:1295] "Started kubelet" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.481831 2581 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.481865 2581 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.481920 2581 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 
11:16:31.486881 2581 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-177.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.487007 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-177.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.488244 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lndpt" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.488279 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.488661 2581 server.go:317] "Adding debug handlers to kubelet server" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.492495 2581 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.492634 2581 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.493428 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lndpt" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.494781 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-177.ec2.internal.18a720c10988f129 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-177.ec2.internal,UID:ip-10-0-136-177.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-177.ec2.internal,},FirstTimestamp:2026-04-17 11:16:31.481696553 +0000 UTC m=+0.423850787,LastTimestamp:2026-04-17 11:16:31.481696553 +0000 UTC m=+0.423850787,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-177.ec2.internal,}" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.499519 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.499902 2581 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.500702 2581 volume_manager.go:295] "The desired_state_of_world 
populator starts" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.500721 2581 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.500701 2581 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.500866 2581 reconstruct.go:97] "Volume reconstruction finished" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.500873 2581 reconciler.go:26] "Reconciler: start to sync state" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.500910 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-177.ec2.internal\" not found" Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.509990 2581 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 11:16:31.720857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.510004 2581 factory.go:55] Registering systemd factory Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.510013 2581 factory.go:223] Registration of the systemd container factory successfully Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.510228 2581 factory.go:153] Registering CRI-O factory Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.510240 2581 factory.go:223] Registration of the crio container factory successfully Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.510262 2581 factory.go:103] Registering Raw factory Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.510278 2581 manager.go:1196] Started watching for new ooms in 
manager Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.510894 2581 manager.go:319] Starting recovery of all containers Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.512213 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.517294 2581 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-136-177.ec2.internal\" not found" node="ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.519601 2581 manager.go:324] Recovery completed Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.527359 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.529982 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.530006 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.530035 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.530482 2581 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.530491 2581 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.530511 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 
17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.538408 2581 policy_none.go:49] "None policy: Start" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.538421 2581 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.538430 2581 state_mem.go:35] "Initializing new in-memory state store" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.578430 2581 manager.go:341] "Starting Device Plugin manager" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.578475 2581 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.578489 2581 server.go:85] "Starting device plugin registration server" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.578781 2581 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.578793 2581 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.578894 2581 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.578968 2581 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.578976 2581 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.579463 2581 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.579498 2581 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-177.ec2.internal\" not found" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.655461 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.656698 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.656724 2581 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.656746 2581 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.656757 2581 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.656799 2581 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.659412 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.679913 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.680763 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.680791 2581 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.680801 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.680837 2581 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.688323 2581 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.688343 2581 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-177.ec2.internal\": node \"ip-10-0-136-177.ec2.internal\" not found" Apr 17 11:16:31.721843 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.706314 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-177.ec2.internal\" not found" Apr 17 11:16:31.757424 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.757379 2581 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-177.ec2.internal"] Apr 17 11:16:31.757488 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.757472 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:31.758512 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.758495 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:31.758590 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.758525 2581 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:31.758590 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.758535 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:31.759768 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.759756 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:31.759905 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.759891 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.759937 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.759920 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:31.760494 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.760476 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:31.760579 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.760508 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:31.760579 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.760523 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:31.760579 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.760523 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:31.760730 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.760610 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 17 11:16:31.760730 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.760627 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:31.761786 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.761771 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.761867 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.761799 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:31.762449 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.762428 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:31.762508 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.762458 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:31.762508 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.762470 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:31.798714 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.798690 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-177.ec2.internal\" not found" node="ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.802576 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.802556 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/badb9c2bed2c4be58a66e5c2f6172037-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal\" (UID: \"badb9c2bed2c4be58a66e5c2f6172037\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.802692 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.802584 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/badb9c2bed2c4be58a66e5c2f6172037-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal\" (UID: \"badb9c2bed2c4be58a66e5c2f6172037\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.802692 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.802601 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f24d5b8e4287d839ebd797e520397e0b-config\") pod \"kube-apiserver-proxy-ip-10-0-136-177.ec2.internal\" (UID: \"f24d5b8e4287d839ebd797e520397e0b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.806350 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.806335 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-177.ec2.internal\" not found" node="ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.807321 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.807302 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-177.ec2.internal\" not found" Apr 17 11:16:31.902755 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.902727 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/badb9c2bed2c4be58a66e5c2f6172037-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal\" (UID: \"badb9c2bed2c4be58a66e5c2f6172037\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal" Apr 17 
11:16:31.902916 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.902764 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f24d5b8e4287d839ebd797e520397e0b-config\") pod \"kube-apiserver-proxy-ip-10-0-136-177.ec2.internal\" (UID: \"f24d5b8e4287d839ebd797e520397e0b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.902916 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.902786 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/badb9c2bed2c4be58a66e5c2f6172037-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal\" (UID: \"badb9c2bed2c4be58a66e5c2f6172037\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.902916 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.902855 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/badb9c2bed2c4be58a66e5c2f6172037-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal\" (UID: \"badb9c2bed2c4be58a66e5c2f6172037\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.902916 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.902855 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/badb9c2bed2c4be58a66e5c2f6172037-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal\" (UID: \"badb9c2bed2c4be58a66e5c2f6172037\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.902916 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:31.902857 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/f24d5b8e4287d839ebd797e520397e0b-config\") pod \"kube-apiserver-proxy-ip-10-0-136-177.ec2.internal\" (UID: \"f24d5b8e4287d839ebd797e520397e0b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-177.ec2.internal" Apr 17 11:16:31.907851 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:31.907834 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-177.ec2.internal\" not found" Apr 17 11:16:32.008140 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:32.008048 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-177.ec2.internal\" not found" Apr 17 11:16:32.101327 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.101287 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal" Apr 17 11:16:32.108168 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:32.108143 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-177.ec2.internal\" not found" Apr 17 11:16:32.109250 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.109235 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-177.ec2.internal" Apr 17 11:16:32.208754 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:32.208702 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-177.ec2.internal\" not found" Apr 17 11:16:32.309256 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:32.309180 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-177.ec2.internal\" not found" Apr 17 11:16:32.400422 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.400393 2581 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 11:16:32.400939 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.400523 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 11:16:32.400939 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.400553 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 11:16:32.409738 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:32.409708 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-177.ec2.internal\" not found" Apr 17 11:16:32.496091 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.496054 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 11:11:31 +0000 UTC" deadline="2027-11-25 03:14:35.067552997 +0000 UTC" Apr 17 11:16:32.496091 
ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.496089 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14079h58m2.571469171s" Apr 17 11:16:32.500184 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.500163 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 11:16:32.504377 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.504359 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:32.510386 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:32.510370 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-177.ec2.internal\" not found" Apr 17 11:16:32.519847 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.519810 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 11:16:32.574256 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.574197 2581 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:32.585417 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:32.585391 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbadb9c2bed2c4be58a66e5c2f6172037.slice/crio-18a33d8778b90b5c79b3c7c180f4020c802d7b27962ad3c725e8fdc2b3467c0f WatchSource:0}: Error finding container 18a33d8778b90b5c79b3c7c180f4020c802d7b27962ad3c725e8fdc2b3467c0f: Status 404 returned error can't find the container with id 18a33d8778b90b5c79b3c7c180f4020c802d7b27962ad3c725e8fdc2b3467c0f Apr 17 11:16:32.585973 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:32.585948 2581 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf24d5b8e4287d839ebd797e520397e0b.slice/crio-323d538e021d8314d02faac9bbd02761cd392f41473a16e8f11a1bf743cabdd1 WatchSource:0}: Error finding container 323d538e021d8314d02faac9bbd02761cd392f41473a16e8f11a1bf743cabdd1: Status 404 returned error can't find the container with id 323d538e021d8314d02faac9bbd02761cd392f41473a16e8f11a1bf743cabdd1 Apr 17 11:16:32.589655 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.589637 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:16:32.593233 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.593214 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8b9fd" Apr 17 11:16:32.600310 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.600294 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal" Apr 17 11:16:32.613291 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.613270 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8b9fd" Apr 17 11:16:32.628497 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.628480 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 11:16:32.629286 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.629275 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-177.ec2.internal" Apr 17 11:16:32.646947 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.646931 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 11:16:32.659840 
ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.659785 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal" event={"ID":"badb9c2bed2c4be58a66e5c2f6172037","Type":"ContainerStarted","Data":"18a33d8778b90b5c79b3c7c180f4020c802d7b27962ad3c725e8fdc2b3467c0f"} Apr 17 11:16:32.660665 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:32.660644 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-177.ec2.internal" event={"ID":"f24d5b8e4287d839ebd797e520397e0b","Type":"ContainerStarted","Data":"323d538e021d8314d02faac9bbd02761cd392f41473a16e8f11a1bf743cabdd1"} Apr 17 11:16:33.469505 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.469470 2581 apiserver.go:52] "Watching apiserver" Apr 17 11:16:33.476312 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.476287 2581 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 11:16:33.478396 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.478369 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-q4q4t","openshift-cluster-node-tuning-operator/tuned-ldnwz","openshift-dns/node-resolver-8mkn6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal","openshift-multus/multus-additional-cni-plugins-b92zr","openshift-multus/multus-szct7","openshift-multus/network-metrics-daemon-9k5nh","openshift-network-operator/iptables-alerter-fvtct","kube-system/konnectivity-agent-qxx8v","kube-system/kube-apiserver-proxy-ip-10-0-136-177.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96","openshift-image-registry/node-ca-sdhv4","openshift-network-diagnostics/network-check-target-rdxhf"] Apr 17 11:16:33.479791 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.479765 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sdhv4" Apr 17 11:16:33.482243 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.482154 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.482463 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.482443 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 11:16:33.482562 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.482545 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 11:16:33.482643 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.482577 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8pfgg\"" Apr 17 11:16:33.482719 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.482582 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 11:16:33.483411 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.483391 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8mkn6" Apr 17 11:16:33.483563 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.483541 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.485150 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.484733 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.485150 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.484864 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:33.485150 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.484878 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:33.485150 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.484950 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pb47b\"" Apr 17 11:16:33.485710 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.485693 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 11:16:33.485863 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.485846 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2sgfg\"" Apr 17 11:16:33.486126 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.486106 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-szct7" Apr 17 11:16:33.486225 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.486206 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 11:16:33.486475 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.486456 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 11:16:33.486606 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.486501 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 11:16:33.486606 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.486510 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 11:16:33.486606 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.486456 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 11:16:33.486866 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.486675 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 11:16:33.486866 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.486804 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 11:16:33.487558 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.487522 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 11:16:33.487558 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.487552 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 
11:16:33.487694 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.487536 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 11:16:33.487874 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.487800 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4fnqs\"" Apr 17 11:16:33.487874 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.487813 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:16:33.488017 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.487877 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 11:16:33.488017 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:33.487941 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8" Apr 17 11:16:33.488017 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.487990 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 11:16:33.488160 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.488068 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-x5tfn\"" Apr 17 11:16:33.489208 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.489106 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-fvtct" Apr 17 11:16:33.491313 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.490838 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qxx8v" Apr 17 11:16:33.491588 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.491565 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 11:16:33.492174 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.491690 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jx9lb\"" Apr 17 11:16:33.492901 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.492878 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:33.492979 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.492900 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:33.492979 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.492891 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ct2lr\"" Apr 17 11:16:33.492979 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.492939 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 11:16:33.494049 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.494022 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 11:16:33.494195 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.494136 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gxl5d\"" 
Apr 17 11:16:33.494195 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.494149 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 11:16:33.494386 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.494371 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:16:33.494430 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.494397 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.494480 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:33.494446 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f" Apr 17 11:16:33.496937 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.496921 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-h22lq\"" Apr 17 11:16:33.497038 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.496954 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 11:16:33.497202 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.497188 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 11:16:33.497296 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.497222 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 11:16:33.501718 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.501699 2581 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 11:16:33.504076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.504057 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:33.509370 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509350 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-cni-binary-copy\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.509511 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509385 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.509511 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509448 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-device-dir\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.509511 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509485 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmbsb\" (UniqueName: \"kubernetes.io/projected/32f662eb-3388-4bc8-9550-6e567567f548-kube-api-access-vmbsb\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.509664 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509510 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-modprobe-d\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.509664 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509532 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-host\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 
17 11:16:33.509664 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509556 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-tuned\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.509664 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509601 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a3311f8e-8452-4224-8b40-1d0392b66a65-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.509987 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509665 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-multus-cni-dir\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.509987 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509694 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmrng\" (UniqueName: \"kubernetes.io/projected/bee0bc88-7732-4010-9886-3df7384bf1c8-kube-api-access-gmrng\") pod \"network-metrics-daemon-9k5nh\" (UID: \"bee0bc88-7732-4010-9886-3df7384bf1c8\") " pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:16:33.509987 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509720 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-lib-modules\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.509987 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509747 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-cni-netd\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.509987 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509771 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-ovn-node-metrics-cert\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.509987 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509810 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b8f2274-31f6-413e-944a-132f5e8db8f6-serviceca\") pod \"node-ca-sdhv4\" (UID: \"3b8f2274-31f6-413e-944a-132f5e8db8f6\") " pod="openshift-image-registry/node-ca-sdhv4" Apr 17 11:16:33.509987 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509876 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-systemd\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.509987 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509900 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-env-overrides\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.509987 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509931 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-hostroot\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.509987 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509957 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-run\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.510392 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.509994 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-systemd-units\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.510392 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510049 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-run-ovn\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.510392 ip-10-0-136-177 kubenswrapper[2581]: I0417 
11:16:33.510089 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-cni-bin\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.510392 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510114 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ba8cdaa8-b4aa-4821-9350-c03f9eb0b50d-agent-certs\") pod \"konnectivity-agent-qxx8v\" (UID: \"ba8cdaa8-b4aa-4821-9350-c03f9eb0b50d\") " pod="kube-system/konnectivity-agent-qxx8v" Apr 17 11:16:33.510392 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510137 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-sys\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.510392 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510161 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzdml\" (UniqueName: \"kubernetes.io/projected/a3311f8e-8452-4224-8b40-1d0392b66a65-kube-api-access-tzdml\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.510392 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510191 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-cnibin\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " 
pod="openshift-multus/multus-szct7" Apr 17 11:16:33.510392 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510225 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-run-openvswitch\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.510392 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510265 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbwzg\" (UniqueName: \"kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg\") pod \"network-check-target-rdxhf\" (UID: \"1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f\") " pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:16:33.510392 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510300 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-run-netns\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.510392 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510325 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-etc-openvswitch\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.510392 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510352 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-run-ovn-kubernetes\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.510392 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510378 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ld8c\" (UniqueName: \"kubernetes.io/projected/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-kube-api-access-5ld8c\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510424 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-etc-selinux\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510458 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ba8cdaa8-b4aa-4821-9350-c03f9eb0b50d-konnectivity-ca\") pod \"konnectivity-agent-qxx8v\" (UID: \"ba8cdaa8-b4aa-4821-9350-c03f9eb0b50d\") " pod="kube-system/konnectivity-agent-qxx8v" Apr 17 11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510491 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-system-cni-dir\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510517 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-etc-kubernetes\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510540 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs\") pod \"network-metrics-daemon-9k5nh\" (UID: \"bee0bc88-7732-4010-9886-3df7384bf1c8\") " pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510565 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-var-lib-kubelet\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510588 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-var-lib-kubelet\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510612 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96-tmp-dir\") pod \"node-resolver-8mkn6\" (UID: \"5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96\") " pod="openshift-dns/node-resolver-8mkn6" Apr 17 
11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510635 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-kubelet\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510658 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-ovnkube-script-lib\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510682 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dxjv\" (UniqueName: \"kubernetes.io/projected/38f8cb42-d739-4806-98ed-206508f9cc9c-kube-api-access-5dxjv\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510704 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-node-log\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510727 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2mxf\" (UniqueName: \"kubernetes.io/projected/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-kube-api-access-s2mxf\") pod 
\"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510749 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnxzl\" (UniqueName: \"kubernetes.io/projected/3b8f2274-31f6-413e-944a-132f5e8db8f6-kube-api-access-fnxzl\") pod \"node-ca-sdhv4\" (UID: \"3b8f2274-31f6-413e-944a-132f5e8db8f6\") " pod="openshift-image-registry/node-ca-sdhv4" Apr 17 11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510774 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-sysconfig\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.510923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510797 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-slash\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510840 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvc4x\" (UniqueName: \"kubernetes.io/projected/505d1e1b-000d-4203-b021-c56c5c5d8c56-kube-api-access-cvc4x\") pod \"iptables-alerter-fvtct\" (UID: \"505d1e1b-000d-4203-b021-c56c5c5d8c56\") " pod="openshift-network-operator/iptables-alerter-fvtct" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510877 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-run-netns\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510909 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-multus-daemon-config\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510942 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-sysctl-conf\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510968 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-run-systemd\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.510991 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/505d1e1b-000d-4203-b021-c56c5c5d8c56-host-slash\") pod \"iptables-alerter-fvtct\" (UID: \"505d1e1b-000d-4203-b021-c56c5c5d8c56\") " pod="openshift-network-operator/iptables-alerter-fvtct" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511017 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-sysctl-d\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511034 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-run-multus-certs\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511048 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a3311f8e-8452-4224-8b40-1d0392b66a65-cnibin\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511073 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a3311f8e-8452-4224-8b40-1d0392b66a65-cni-binary-copy\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511118 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a3311f8e-8452-4224-8b40-1d0392b66a65-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b92zr\" (UID: 
\"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511156 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-var-lib-cni-bin\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511182 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b8f2274-31f6-413e-944a-132f5e8db8f6-host\") pod \"node-ca-sdhv4\" (UID: \"3b8f2274-31f6-413e-944a-132f5e8db8f6\") " pod="openshift-image-registry/node-ca-sdhv4" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511207 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nqpn\" (UniqueName: \"kubernetes.io/projected/5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96-kube-api-access-8nqpn\") pod \"node-resolver-8mkn6\" (UID: \"5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96\") " pod="openshift-dns/node-resolver-8mkn6" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511231 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3311f8e-8452-4224-8b40-1d0392b66a65-system-cni-dir\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.511594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511255 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/a3311f8e-8452-4224-8b40-1d0392b66a65-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.512186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511279 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-multus-socket-dir-parent\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.512186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511303 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-socket-dir\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.512186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511326 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-sys-fs\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.512186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511357 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-ovnkube-config\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.512186 ip-10-0-136-177 
kubenswrapper[2581]: I0417 11:16:33.511387 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/505d1e1b-000d-4203-b021-c56c5c5d8c56-iptables-alerter-script\") pod \"iptables-alerter-fvtct\" (UID: \"505d1e1b-000d-4203-b021-c56c5c5d8c56\") " pod="openshift-network-operator/iptables-alerter-fvtct" Apr 17 11:16:33.512186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511437 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a3311f8e-8452-4224-8b40-1d0392b66a65-os-release\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.512186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511467 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-os-release\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.512186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511494 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-var-lib-cni-multus\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.512186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511529 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-multus-conf-dir\") pod \"multus-szct7\" (UID: 
\"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.512186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511550 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-registration-dir\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.512186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511573 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38f8cb42-d739-4806-98ed-206508f9cc9c-tmp\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.512186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511590 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96-hosts-file\") pod \"node-resolver-8mkn6\" (UID: \"5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96\") " pod="openshift-dns/node-resolver-8mkn6" Apr 17 11:16:33.512186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511604 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-log-socket\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.512186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511622 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-run-k8s-cni-cncf-io\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.512186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511644 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-kubernetes\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.512186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511680 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-var-lib-openvswitch\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.512738 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.511712 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.612389 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612359 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b8f2274-31f6-413e-944a-132f5e8db8f6-serviceca\") pod \"node-ca-sdhv4\" (UID: \"3b8f2274-31f6-413e-944a-132f5e8db8f6\") " pod="openshift-image-registry/node-ca-sdhv4" Apr 17 11:16:33.612389 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612392 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-systemd\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.612605 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612407 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-env-overrides\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.612605 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612422 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-hostroot\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.612605 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612436 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-run\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.612605 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612502 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-run\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.612605 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612503 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-systemd\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.612605 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612538 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-systemd-units\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.612605 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612552 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-hostroot\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.612605 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612564 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-run-ovn\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.612605 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612586 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-systemd-units\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.612605 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612590 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-cni-bin\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612618 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ba8cdaa8-b4aa-4821-9350-c03f9eb0b50d-agent-certs\") pod \"konnectivity-agent-qxx8v\" (UID: \"ba8cdaa8-b4aa-4821-9350-c03f9eb0b50d\") " pod="kube-system/konnectivity-agent-qxx8v" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612622 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-run-ovn\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612641 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-sys\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612657 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-cni-bin\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612667 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzdml\" (UniqueName: 
\"kubernetes.io/projected/a3311f8e-8452-4224-8b40-1d0392b66a65-kube-api-access-tzdml\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612691 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-cnibin\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612713 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-run-openvswitch\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612736 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbwzg\" (UniqueName: \"kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg\") pod \"network-check-target-rdxhf\" (UID: \"1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f\") " pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612759 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-run-netns\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612783 2581 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-etc-openvswitch\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612804 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-run-ovn-kubernetes\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612844 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ld8c\" (UniqueName: \"kubernetes.io/projected/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-kube-api-access-5ld8c\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612868 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-etc-selinux\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612890 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ba8cdaa8-b4aa-4821-9350-c03f9eb0b50d-konnectivity-ca\") pod \"konnectivity-agent-qxx8v\" (UID: \"ba8cdaa8-b4aa-4821-9350-c03f9eb0b50d\") " pod="kube-system/konnectivity-agent-qxx8v" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612933 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-system-cni-dir\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.612982 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612939 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-env-overrides\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613003 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-system-cni-dir\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.612939 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b8f2274-31f6-413e-944a-132f5e8db8f6-serviceca\") pod \"node-ca-sdhv4\" (UID: \"3b8f2274-31f6-413e-944a-132f5e8db8f6\") " pod="openshift-image-registry/node-ca-sdhv4" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613033 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-run-openvswitch\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613048 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-run-netns\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613061 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-etc-openvswitch\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613088 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-cnibin\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613104 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-sys\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613004 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-etc-kubernetes\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613139 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs\") pod \"network-metrics-daemon-9k5nh\" (UID: \"bee0bc88-7732-4010-9886-3df7384bf1c8\") " pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613162 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-var-lib-kubelet\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613221 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-var-lib-kubelet\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613245 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96-tmp-dir\") pod \"node-resolver-8mkn6\" (UID: \"5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96\") " pod="openshift-dns/node-resolver-8mkn6" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613269 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-kubelet\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613294 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-ovnkube-script-lib\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613316 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-run-ovn-kubernetes\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613327 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dxjv\" (UniqueName: \"kubernetes.io/projected/38f8cb42-d739-4806-98ed-206508f9cc9c-kube-api-access-5dxjv\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613349 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-node-log\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.613584 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613367 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-etc-kubernetes\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613417 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2mxf\" 
(UniqueName: \"kubernetes.io/projected/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-kube-api-access-s2mxf\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613423 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-var-lib-kubelet\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613444 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnxzl\" (UniqueName: \"kubernetes.io/projected/3b8f2274-31f6-413e-944a-132f5e8db8f6-kube-api-access-fnxzl\") pod \"node-ca-sdhv4\" (UID: \"3b8f2274-31f6-413e-944a-132f5e8db8f6\") " pod="openshift-image-registry/node-ca-sdhv4" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613467 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-sysconfig\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613487 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-slash\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613503 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ba8cdaa8-b4aa-4821-9350-c03f9eb0b50d-konnectivity-ca\") pod \"konnectivity-agent-qxx8v\" (UID: \"ba8cdaa8-b4aa-4821-9350-c03f9eb0b50d\") " pod="kube-system/konnectivity-agent-qxx8v" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613510 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvc4x\" (UniqueName: \"kubernetes.io/projected/505d1e1b-000d-4203-b021-c56c5c5d8c56-kube-api-access-cvc4x\") pod \"iptables-alerter-fvtct\" (UID: \"505d1e1b-000d-4203-b021-c56c5c5d8c56\") " pod="openshift-network-operator/iptables-alerter-fvtct" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613523 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-etc-selinux\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613536 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-run-netns\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613552 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-kubelet\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613640 2581 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-var-lib-kubelet\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613703 2581 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:33.613787 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:33.613906 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs podName:bee0bc88-7732-4010-9886-3df7384bf1c8 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:34.113884158 +0000 UTC m=+3.056038395 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs") pod "network-metrics-daemon-9k5nh" (UID: "bee0bc88-7732-4010-9886-3df7384bf1c8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613905 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96-tmp-dir\") pod \"node-resolver-8mkn6\" (UID: \"5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96\") " pod="openshift-dns/node-resolver-8mkn6" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613953 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-node-log\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.613991 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-run-netns\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.614259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614032 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-slash\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614066 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-multus-daemon-config\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614095 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-sysconfig\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614124 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:32 +0000 UTC" deadline="2027-09-17 01:23:56.121447682 +0000 UTC" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614141 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12422h7m22.507309137s" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614130 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-sysctl-conf\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614190 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-run-systemd\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614236 
2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-sysctl-conf\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614241 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-run-systemd\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614269 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/505d1e1b-000d-4203-b021-c56c5c5d8c56-host-slash\") pod \"iptables-alerter-fvtct\" (UID: \"505d1e1b-000d-4203-b021-c56c5c5d8c56\") " pod="openshift-network-operator/iptables-alerter-fvtct" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614391 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-ovnkube-script-lib\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614402 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-sysctl-d\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614439 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/505d1e1b-000d-4203-b021-c56c5c5d8c56-host-slash\") pod \"iptables-alerter-fvtct\" (UID: \"505d1e1b-000d-4203-b021-c56c5c5d8c56\") " pod="openshift-network-operator/iptables-alerter-fvtct" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614489 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-run-multus-certs\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614537 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-run-multus-certs\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614564 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a3311f8e-8452-4224-8b40-1d0392b66a65-cnibin\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614605 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-sysctl-d\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614621 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a3311f8e-8452-4224-8b40-1d0392b66a65-cni-binary-copy\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.615094 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614652 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a3311f8e-8452-4224-8b40-1d0392b66a65-cnibin\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614695 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a3311f8e-8452-4224-8b40-1d0392b66a65-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614706 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-multus-daemon-config\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614722 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-var-lib-cni-bin\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.615895 
ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614748 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b8f2274-31f6-413e-944a-132f5e8db8f6-host\") pod \"node-ca-sdhv4\" (UID: \"3b8f2274-31f6-413e-944a-132f5e8db8f6\") " pod="openshift-image-registry/node-ca-sdhv4" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614784 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nqpn\" (UniqueName: \"kubernetes.io/projected/5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96-kube-api-access-8nqpn\") pod \"node-resolver-8mkn6\" (UID: \"5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96\") " pod="openshift-dns/node-resolver-8mkn6" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614809 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3311f8e-8452-4224-8b40-1d0392b66a65-system-cni-dir\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614854 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a3311f8e-8452-4224-8b40-1d0392b66a65-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614905 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-multus-socket-dir-parent\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " 
pod="openshift-multus/multus-szct7" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614937 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-var-lib-cni-bin\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614947 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3311f8e-8452-4224-8b40-1d0392b66a65-system-cni-dir\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614945 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-socket-dir\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.614983 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b8f2274-31f6-413e-944a-132f5e8db8f6-host\") pod \"node-ca-sdhv4\" (UID: \"3b8f2274-31f6-413e-944a-132f5e8db8f6\") " pod="openshift-image-registry/node-ca-sdhv4" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615020 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-sys-fs\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615046 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-ovnkube-config\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615051 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a3311f8e-8452-4224-8b40-1d0392b66a65-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615085 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/505d1e1b-000d-4203-b021-c56c5c5d8c56-iptables-alerter-script\") pod \"iptables-alerter-fvtct\" (UID: \"505d1e1b-000d-4203-b021-c56c5c5d8c56\") " pod="openshift-network-operator/iptables-alerter-fvtct" Apr 17 11:16:33.615895 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615118 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-socket-dir\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615120 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-sys-fs\") pod 
\"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615173 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a3311f8e-8452-4224-8b40-1d0392b66a65-os-release\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615194 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-multus-socket-dir-parent\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615202 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-os-release\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615246 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-var-lib-cni-multus\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615256 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/a3311f8e-8452-4224-8b40-1d0392b66a65-os-release\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615262 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-os-release\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615277 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-multus-conf-dir\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615304 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-var-lib-cni-multus\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615306 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-registration-dir\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615343 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/38f8cb42-d739-4806-98ed-206508f9cc9c-tmp\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615354 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-registration-dir\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615368 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96-hosts-file\") pod \"node-resolver-8mkn6\" (UID: \"5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96\") " pod="openshift-dns/node-resolver-8mkn6" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615374 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-multus-conf-dir\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615392 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-log-socket\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615414 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-run-k8s-cni-cncf-io\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615436 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-kubernetes\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.616645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615451 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-ovnkube-config\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615472 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-log-socket\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615474 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-host-run-k8s-cni-cncf-io\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615481 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96-hosts-file\") pod \"node-resolver-8mkn6\" (UID: \"5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96\") " pod="openshift-dns/node-resolver-8mkn6" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615487 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-var-lib-openvswitch\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615511 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-kubernetes\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615518 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615534 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-var-lib-openvswitch\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615593 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-cni-binary-copy\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615574 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615633 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615659 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-device-dir\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615687 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmbsb\" (UniqueName: \"kubernetes.io/projected/32f662eb-3388-4bc8-9550-6e567567f548-kube-api-access-vmbsb\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615712 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-modprobe-d\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615737 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-host\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615768 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-tuned\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615794 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a3311f8e-8452-4224-8b40-1d0392b66a65-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.617342 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615838 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-multus-cni-dir\") pod 
\"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615867 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmrng\" (UniqueName: \"kubernetes.io/projected/bee0bc88-7732-4010-9886-3df7384bf1c8-kube-api-access-gmrng\") pod \"network-metrics-daemon-9k5nh\" (UID: \"bee0bc88-7732-4010-9886-3df7384bf1c8\") " pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615898 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-lib-modules\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615954 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-cni-netd\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.615981 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-ovn-node-metrics-cert\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.616122 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-cni-binary-copy\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.616166 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/505d1e1b-000d-4203-b021-c56c5c5d8c56-iptables-alerter-script\") pod \"iptables-alerter-fvtct\" (UID: \"505d1e1b-000d-4203-b021-c56c5c5d8c56\") " pod="openshift-network-operator/iptables-alerter-fvtct" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.616184 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a3311f8e-8452-4224-8b40-1d0392b66a65-cni-binary-copy\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.616221 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-device-dir\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.616280 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-host\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.616297 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" 
(UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-modprobe-d\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.616170 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a3311f8e-8452-4224-8b40-1d0392b66a65-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.616407 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/38f8cb42-d739-4806-98ed-206508f9cc9c-lib-modules\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.616431 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32f662eb-3388-4bc8-9550-6e567567f548-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.616472 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-host-cni-netd\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.616476 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-multus-cni-dir\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.616758 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a3311f8e-8452-4224-8b40-1d0392b66a65-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.618076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.617932 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38f8cb42-d739-4806-98ed-206508f9cc9c-tmp\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.618917 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.618253 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ba8cdaa8-b4aa-4821-9350-c03f9eb0b50d-agent-certs\") pod \"konnectivity-agent-qxx8v\" (UID: \"ba8cdaa8-b4aa-4821-9350-c03f9eb0b50d\") " pod="kube-system/konnectivity-agent-qxx8v" Apr 17 11:16:33.618917 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.618413 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/38f8cb42-d739-4806-98ed-206508f9cc9c-etc-tuned\") pod \"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.618917 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.618610 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-ovn-node-metrics-cert\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.643345 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:33.643297 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:33.643345 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:33.643323 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:33.643345 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:33.643337 2581 projected.go:194] Error preparing data for projected volume kube-api-access-dbwzg for pod openshift-network-diagnostics/network-check-target-rdxhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:33.643605 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:33.643402 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg podName:1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f nodeName:}" failed. No retries permitted until 2026-04-17 11:16:34.143384878 +0000 UTC m=+3.085539099 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dbwzg" (UniqueName: "kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg") pod "network-check-target-rdxhf" (UID: "1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:33.645792 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.645766 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2mxf\" (UniqueName: \"kubernetes.io/projected/4ab6c44c-fe99-4f3a-a19e-93959e1d3d56-kube-api-access-s2mxf\") pod \"ovnkube-node-q4q4t\" (UID: \"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.646071 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.646051 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzdml\" (UniqueName: \"kubernetes.io/projected/a3311f8e-8452-4224-8b40-1d0392b66a65-kube-api-access-tzdml\") pod \"multus-additional-cni-plugins-b92zr\" (UID: \"a3311f8e-8452-4224-8b40-1d0392b66a65\") " pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.646433 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.646411 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ld8c\" (UniqueName: \"kubernetes.io/projected/2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b-kube-api-access-5ld8c\") pod \"multus-szct7\" (UID: \"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b\") " pod="openshift-multus/multus-szct7" Apr 17 11:16:33.647721 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.647698 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvc4x\" (UniqueName: \"kubernetes.io/projected/505d1e1b-000d-4203-b021-c56c5c5d8c56-kube-api-access-cvc4x\") pod \"iptables-alerter-fvtct\" (UID: \"505d1e1b-000d-4203-b021-c56c5c5d8c56\") " 
pod="openshift-network-operator/iptables-alerter-fvtct" Apr 17 11:16:33.650357 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.650334 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nqpn\" (UniqueName: \"kubernetes.io/projected/5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96-kube-api-access-8nqpn\") pod \"node-resolver-8mkn6\" (UID: \"5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96\") " pod="openshift-dns/node-resolver-8mkn6" Apr 17 11:16:33.650613 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.650588 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnxzl\" (UniqueName: \"kubernetes.io/projected/3b8f2274-31f6-413e-944a-132f5e8db8f6-kube-api-access-fnxzl\") pod \"node-ca-sdhv4\" (UID: \"3b8f2274-31f6-413e-944a-132f5e8db8f6\") " pod="openshift-image-registry/node-ca-sdhv4" Apr 17 11:16:33.650708 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.650615 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmbsb\" (UniqueName: \"kubernetes.io/projected/32f662eb-3388-4bc8-9550-6e567567f548-kube-api-access-vmbsb\") pod \"aws-ebs-csi-driver-node-dpf96\" (UID: \"32f662eb-3388-4bc8-9550-6e567567f548\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.651398 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.651374 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmrng\" (UniqueName: \"kubernetes.io/projected/bee0bc88-7732-4010-9886-3df7384bf1c8-kube-api-access-gmrng\") pod \"network-metrics-daemon-9k5nh\" (UID: \"bee0bc88-7732-4010-9886-3df7384bf1c8\") " pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:16:33.651572 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.651555 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dxjv\" (UniqueName: \"kubernetes.io/projected/38f8cb42-d739-4806-98ed-206508f9cc9c-kube-api-access-5dxjv\") pod 
\"tuned-ldnwz\" (UID: \"38f8cb42-d739-4806-98ed-206508f9cc9c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.664522 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.663793 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:33.793245 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.793159 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-t82fh"] Apr 17 11:16:33.793245 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.793218 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sdhv4" Apr 17 11:16:33.796316 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.796295 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:33.796435 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:33.796383 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f" Apr 17 11:16:33.799627 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.799605 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" Apr 17 11:16:33.809263 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.809239 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8mkn6" Apr 17 11:16:33.812969 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.812950 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:33.817560 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.817540 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret\") pod \"global-pull-secret-syncer-t82fh\" (UID: \"09f10455-02ae-4c95-91a9-6c0b6af2b02f\") " pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:33.817659 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.817579 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/09f10455-02ae-4c95-91a9-6c0b6af2b02f-kubelet-config\") pod \"global-pull-secret-syncer-t82fh\" (UID: \"09f10455-02ae-4c95-91a9-6c0b6af2b02f\") " pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:33.817719 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.817688 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/09f10455-02ae-4c95-91a9-6c0b6af2b02f-dbus\") pod \"global-pull-secret-syncer-t82fh\" (UID: \"09f10455-02ae-4c95-91a9-6c0b6af2b02f\") " pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:33.819571 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.819554 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b92zr" Apr 17 11:16:33.827214 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.827195 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-szct7" Apr 17 11:16:33.834656 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.834637 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-fvtct" Apr 17 11:16:33.840776 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.840755 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qxx8v" Apr 17 11:16:33.846326 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.846309 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" Apr 17 11:16:33.918043 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.917998 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/09f10455-02ae-4c95-91a9-6c0b6af2b02f-dbus\") pod \"global-pull-secret-syncer-t82fh\" (UID: \"09f10455-02ae-4c95-91a9-6c0b6af2b02f\") " pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:33.918215 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.918061 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret\") pod \"global-pull-secret-syncer-t82fh\" (UID: \"09f10455-02ae-4c95-91a9-6c0b6af2b02f\") " pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:33.918215 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.918105 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/09f10455-02ae-4c95-91a9-6c0b6af2b02f-kubelet-config\") pod \"global-pull-secret-syncer-t82fh\" (UID: \"09f10455-02ae-4c95-91a9-6c0b6af2b02f\") " pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:33.918215 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.918199 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/09f10455-02ae-4c95-91a9-6c0b6af2b02f-dbus\") pod 
\"global-pull-secret-syncer-t82fh\" (UID: \"09f10455-02ae-4c95-91a9-6c0b6af2b02f\") " pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:33.918343 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:33.918241 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/09f10455-02ae-4c95-91a9-6c0b6af2b02f-kubelet-config\") pod \"global-pull-secret-syncer-t82fh\" (UID: \"09f10455-02ae-4c95-91a9-6c0b6af2b02f\") " pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:33.918343 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:33.918269 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:33.918418 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:33.918356 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret podName:09f10455-02ae-4c95-91a9-6c0b6af2b02f nodeName:}" failed. No retries permitted until 2026-04-17 11:16:34.418334633 +0000 UTC m=+3.360488855 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret") pod "global-pull-secret-syncer-t82fh" (UID: "09f10455-02ae-4c95-91a9-6c0b6af2b02f") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:34.119782 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:34.119742 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs\") pod \"network-metrics-daemon-9k5nh\" (UID: \"bee0bc88-7732-4010-9886-3df7384bf1c8\") " pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:16:34.119948 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:34.119917 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:34.120019 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:34.119992 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs podName:bee0bc88-7732-4010-9886-3df7384bf1c8 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:35.119970678 +0000 UTC m=+4.062124897 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs") pod "network-metrics-daemon-9k5nh" (UID: "bee0bc88-7732-4010-9886-3df7384bf1c8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:34.220294 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:34.220262 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbwzg\" (UniqueName: \"kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg\") pod \"network-check-target-rdxhf\" (UID: \"1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f\") " pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:16:34.220468 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:34.220448 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:34.220543 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:34.220475 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:34.220543 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:34.220488 2581 projected.go:194] Error preparing data for projected volume kube-api-access-dbwzg for pod openshift-network-diagnostics/network-check-target-rdxhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:34.220647 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:34.220553 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg podName:1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:35.220533837 +0000 UTC m=+4.162688074 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-dbwzg" (UniqueName: "kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg") pod "network-check-target-rdxhf" (UID: "1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:34.230671 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:34.230635 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3311f8e_8452_4224_8b40_1d0392b66a65.slice/crio-18462bc6ba6e71389b25487a3ab94b4ea67a41301b98513303cae307dc7e802f WatchSource:0}: Error finding container 18462bc6ba6e71389b25487a3ab94b4ea67a41301b98513303cae307dc7e802f: Status 404 returned error can't find the container with id 18462bc6ba6e71389b25487a3ab94b4ea67a41301b98513303cae307dc7e802f Apr 17 11:16:34.231612 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:34.231586 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32f662eb_3388_4bc8_9550_6e567567f548.slice/crio-4077cbd1d6129e1d00f97eb0c547c91d95e98c6c5e3a26ab40452a751edf0d86 WatchSource:0}: Error finding container 4077cbd1d6129e1d00f97eb0c547c91d95e98c6c5e3a26ab40452a751edf0d86: Status 404 returned error can't find the container with id 4077cbd1d6129e1d00f97eb0c547c91d95e98c6c5e3a26ab40452a751edf0d86 Apr 17 11:16:34.233554 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:34.233229 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d3b1e8c_4d88_46ef_95e8_c7034cf6ec2b.slice/crio-7f9f4e7a52fab28974b30da33bdd40738698c090f54dddfcf53a6829911883a0 WatchSource:0}: Error finding container 
7f9f4e7a52fab28974b30da33bdd40738698c090f54dddfcf53a6829911883a0: Status 404 returned error can't find the container with id 7f9f4e7a52fab28974b30da33bdd40738698c090f54dddfcf53a6829911883a0
Apr 17 11:16:34.233805 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:34.233782 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba8cdaa8_b4aa_4821_9350_c03f9eb0b50d.slice/crio-a4dad3402a7c3f032b2d2f685b9630648126f29ca23fd8fc426773fa64b1006e WatchSource:0}: Error finding container a4dad3402a7c3f032b2d2f685b9630648126f29ca23fd8fc426773fa64b1006e: Status 404 returned error can't find the container with id a4dad3402a7c3f032b2d2f685b9630648126f29ca23fd8fc426773fa64b1006e
Apr 17 11:16:34.237404 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:34.237386 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b8f2274_31f6_413e_944a_132f5e8db8f6.slice/crio-6702b349d07d62095aa4aa6a5b84f58520542a72ed3d3f71fd3ed47e06d21a09 WatchSource:0}: Error finding container 6702b349d07d62095aa4aa6a5b84f58520542a72ed3d3f71fd3ed47e06d21a09: Status 404 returned error can't find the container with id 6702b349d07d62095aa4aa6a5b84f58520542a72ed3d3f71fd3ed47e06d21a09
Apr 17 11:16:34.238883 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:34.238862 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ab6c44c_fe99_4f3a_a19e_93959e1d3d56.slice/crio-b98ee86bef4fa750965256f19c7c1b33d65d13260b380ca50ac010f2d89e99e6 WatchSource:0}: Error finding container b98ee86bef4fa750965256f19c7c1b33d65d13260b380ca50ac010f2d89e99e6: Status 404 returned error can't find the container with id b98ee86bef4fa750965256f19c7c1b33d65d13260b380ca50ac010f2d89e99e6
Apr 17 11:16:34.239361 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:34.239279 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod505d1e1b_000d_4203_b021_c56c5c5d8c56.slice/crio-d1ff79be4ae24537c90d6c01a2fafbf27ec751258a9da8283aef04652d7dcbfd WatchSource:0}: Error finding container d1ff79be4ae24537c90d6c01a2fafbf27ec751258a9da8283aef04652d7dcbfd: Status 404 returned error can't find the container with id d1ff79be4ae24537c90d6c01a2fafbf27ec751258a9da8283aef04652d7dcbfd
Apr 17 11:16:34.240309 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:34.240257 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38f8cb42_d739_4806_98ed_206508f9cc9c.slice/crio-335b7020603caba7e54495c59da7df7bd461d1f60e75b52eb7633432aad4c1bc WatchSource:0}: Error finding container 335b7020603caba7e54495c59da7df7bd461d1f60e75b52eb7633432aad4c1bc: Status 404 returned error can't find the container with id 335b7020603caba7e54495c59da7df7bd461d1f60e75b52eb7633432aad4c1bc
Apr 17 11:16:34.240974 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:16:34.240788 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ee7c1ba_cbb5_4b2b_b2f7_23d447b10b96.slice/crio-2f969e527e4a276d7771d833e8e1f57dec729013602138ce2b35e3f2a608fb90 WatchSource:0}: Error finding container 2f969e527e4a276d7771d833e8e1f57dec729013602138ce2b35e3f2a608fb90: Status 404 returned error can't find the container with id 2f969e527e4a276d7771d833e8e1f57dec729013602138ce2b35e3f2a608fb90
Apr 17 11:16:34.421944 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:34.421686 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret\") pod \"global-pull-secret-syncer-t82fh\" (UID: \"09f10455-02ae-4c95-91a9-6c0b6af2b02f\") " pod="kube-system/global-pull-secret-syncer-t82fh"
Apr 17 11:16:34.421944 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:34.421834 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:34.422143 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:34.421984 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret podName:09f10455-02ae-4c95-91a9-6c0b6af2b02f nodeName:}" failed. No retries permitted until 2026-04-17 11:16:35.421969724 +0000 UTC m=+4.364123944 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret") pod "global-pull-secret-syncer-t82fh" (UID: "09f10455-02ae-4c95-91a9-6c0b6af2b02f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:34.614591 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:34.614525 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:32 +0000 UTC" deadline="2027-10-21 09:03:10.599090122 +0000 UTC"
Apr 17 11:16:34.614591 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:34.614567 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13245h46m35.984527259s"
Apr 17 11:16:34.680447 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:34.680362 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8mkn6" event={"ID":"5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96","Type":"ContainerStarted","Data":"2f969e527e4a276d7771d833e8e1f57dec729013602138ce2b35e3f2a608fb90"}
Apr 17 11:16:34.684622 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:34.684590 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fvtct" event={"ID":"505d1e1b-000d-4203-b021-c56c5c5d8c56","Type":"ContainerStarted","Data":"d1ff79be4ae24537c90d6c01a2fafbf27ec751258a9da8283aef04652d7dcbfd"}
Apr 17 11:16:34.689234 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:34.689181 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" event={"ID":"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56","Type":"ContainerStarted","Data":"b98ee86bef4fa750965256f19c7c1b33d65d13260b380ca50ac010f2d89e99e6"}
Apr 17 11:16:34.691398 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:34.691350 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sdhv4" event={"ID":"3b8f2274-31f6-413e-944a-132f5e8db8f6","Type":"ContainerStarted","Data":"6702b349d07d62095aa4aa6a5b84f58520542a72ed3d3f71fd3ed47e06d21a09"}
Apr 17 11:16:34.700186 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:34.700119 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qxx8v" event={"ID":"ba8cdaa8-b4aa-4821-9350-c03f9eb0b50d","Type":"ContainerStarted","Data":"a4dad3402a7c3f032b2d2f685b9630648126f29ca23fd8fc426773fa64b1006e"}
Apr 17 11:16:34.705951 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:34.705914 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" event={"ID":"38f8cb42-d739-4806-98ed-206508f9cc9c","Type":"ContainerStarted","Data":"335b7020603caba7e54495c59da7df7bd461d1f60e75b52eb7633432aad4c1bc"}
Apr 17 11:16:34.709987 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:34.709937 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szct7" event={"ID":"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b","Type":"ContainerStarted","Data":"7f9f4e7a52fab28974b30da33bdd40738698c090f54dddfcf53a6829911883a0"}
Apr 17 11:16:34.713548 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:34.713495 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" event={"ID":"32f662eb-3388-4bc8-9550-6e567567f548","Type":"ContainerStarted","Data":"4077cbd1d6129e1d00f97eb0c547c91d95e98c6c5e3a26ab40452a751edf0d86"}
Apr 17 11:16:34.717604 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:34.717556 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b92zr" event={"ID":"a3311f8e-8452-4224-8b40-1d0392b66a65","Type":"ContainerStarted","Data":"18462bc6ba6e71389b25487a3ab94b4ea67a41301b98513303cae307dc7e802f"}
Apr 17 11:16:34.724509 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:34.723979 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-177.ec2.internal" event={"ID":"f24d5b8e4287d839ebd797e520397e0b","Type":"ContainerStarted","Data":"e7202c4ec3e05126ac5d3f2e8c20d546e09d0f0e9fa7bcfe9bcec3d10c376d5c"}
Apr 17 11:16:35.129279 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:35.129223 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs\") pod \"network-metrics-daemon-9k5nh\" (UID: \"bee0bc88-7732-4010-9886-3df7384bf1c8\") " pod="openshift-multus/network-metrics-daemon-9k5nh"
Apr 17 11:16:35.129458 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:35.129371 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:35.129458 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:35.129439 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs podName:bee0bc88-7732-4010-9886-3df7384bf1c8 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:37.129419678 +0000 UTC m=+6.071573898 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs") pod "network-metrics-daemon-9k5nh" (UID: "bee0bc88-7732-4010-9886-3df7384bf1c8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:35.230841 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:35.230200 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbwzg\" (UniqueName: \"kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg\") pod \"network-check-target-rdxhf\" (UID: \"1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f\") " pod="openshift-network-diagnostics/network-check-target-rdxhf"
Apr 17 11:16:35.230841 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:35.230379 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:35.230841 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:35.230399 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:35.230841 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:35.230412 2581 projected.go:194] Error preparing data for projected volume kube-api-access-dbwzg for pod openshift-network-diagnostics/network-check-target-rdxhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:35.230841 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:35.230474 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg podName:1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f nodeName:}" failed. No retries permitted until 2026-04-17 11:16:37.230455442 +0000 UTC m=+6.172609686 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dbwzg" (UniqueName: "kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg") pod "network-check-target-rdxhf" (UID: "1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:35.432707 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:35.432669 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret\") pod \"global-pull-secret-syncer-t82fh\" (UID: \"09f10455-02ae-4c95-91a9-6c0b6af2b02f\") " pod="kube-system/global-pull-secret-syncer-t82fh"
Apr 17 11:16:35.432888 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:35.432868 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:35.432981 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:35.432936 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret podName:09f10455-02ae-4c95-91a9-6c0b6af2b02f nodeName:}" failed. No retries permitted until 2026-04-17 11:16:37.432918486 +0000 UTC m=+6.375072731 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret") pod "global-pull-secret-syncer-t82fh" (UID: "09f10455-02ae-4c95-91a9-6c0b6af2b02f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:35.657846 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:35.657750 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh"
Apr 17 11:16:35.658302 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:35.657902 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8"
Apr 17 11:16:35.658302 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:35.657913 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf"
Apr 17 11:16:35.658302 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:35.658013 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f"
Apr 17 11:16:35.658464 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:35.658396 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh"
Apr 17 11:16:35.661098 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:35.658501 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f"
Apr 17 11:16:35.737786 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:35.737625 2581 generic.go:358] "Generic (PLEG): container finished" podID="badb9c2bed2c4be58a66e5c2f6172037" containerID="46a9f662486481687b22e2fffb113ec40e230c6e424c8a3357fc8e8c875b75d2" exitCode=0
Apr 17 11:16:35.737786 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:35.737709 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal" event={"ID":"badb9c2bed2c4be58a66e5c2f6172037","Type":"ContainerDied","Data":"46a9f662486481687b22e2fffb113ec40e230c6e424c8a3357fc8e8c875b75d2"}
Apr 17 11:16:35.755685 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:35.755632 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-177.ec2.internal" podStartSLOduration=3.755615851 podStartE2EDuration="3.755615851s" podCreationTimestamp="2026-04-17 11:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:34.740061379 +0000 UTC m=+3.682215621" watchObservedRunningTime="2026-04-17 11:16:35.755615851 +0000 UTC m=+4.697770093"
Apr 17 11:16:36.743170 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:36.742973 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal" event={"ID":"badb9c2bed2c4be58a66e5c2f6172037","Type":"ContainerStarted","Data":"079ccfb83df75768b5e0df9ac64b02c156156517918933e9cb0a8c7a24061efd"}
Apr 17 11:16:36.759187 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:36.759136 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-177.ec2.internal" podStartSLOduration=4.759118812 podStartE2EDuration="4.759118812s" podCreationTimestamp="2026-04-17 11:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:36.759070591 +0000 UTC m=+5.701224835" watchObservedRunningTime="2026-04-17 11:16:36.759118812 +0000 UTC m=+5.701273055"
Apr 17 11:16:37.151185 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:37.150615 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs\") pod \"network-metrics-daemon-9k5nh\" (UID: \"bee0bc88-7732-4010-9886-3df7384bf1c8\") " pod="openshift-multus/network-metrics-daemon-9k5nh"
Apr 17 11:16:37.151185 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:37.150754 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:37.151185 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:37.150834 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs podName:bee0bc88-7732-4010-9886-3df7384bf1c8 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:41.150802953 +0000 UTC m=+10.092957175 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs") pod "network-metrics-daemon-9k5nh" (UID: "bee0bc88-7732-4010-9886-3df7384bf1c8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:37.252040 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:37.251989 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbwzg\" (UniqueName: \"kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg\") pod \"network-check-target-rdxhf\" (UID: \"1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f\") " pod="openshift-network-diagnostics/network-check-target-rdxhf"
Apr 17 11:16:37.252207 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:37.252170 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:37.252207 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:37.252189 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:37.252207 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:37.252200 2581 projected.go:194] Error preparing data for projected volume kube-api-access-dbwzg for pod openshift-network-diagnostics/network-check-target-rdxhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:37.252357 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:37.252263 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg podName:1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f nodeName:}" failed. No retries permitted until 2026-04-17 11:16:41.252243945 +0000 UTC m=+10.194398167 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dbwzg" (UniqueName: "kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg") pod "network-check-target-rdxhf" (UID: "1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:37.454486 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:37.454390 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret\") pod \"global-pull-secret-syncer-t82fh\" (UID: \"09f10455-02ae-4c95-91a9-6c0b6af2b02f\") " pod="kube-system/global-pull-secret-syncer-t82fh"
Apr 17 11:16:37.454657 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:37.454593 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:37.454725 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:37.454657 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret podName:09f10455-02ae-4c95-91a9-6c0b6af2b02f nodeName:}" failed. No retries permitted until 2026-04-17 11:16:41.454637624 +0000 UTC m=+10.396791854 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret") pod "global-pull-secret-syncer-t82fh" (UID: "09f10455-02ae-4c95-91a9-6c0b6af2b02f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:37.657512 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:37.657481 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh"
Apr 17 11:16:37.657698 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:37.657614 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f"
Apr 17 11:16:37.657977 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:37.657497 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh"
Apr 17 11:16:37.658110 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:37.658084 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8"
Apr 17 11:16:37.658187 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:37.658140 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf"
Apr 17 11:16:37.658289 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:37.658240 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f"
Apr 17 11:16:39.657609 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:39.657121 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh"
Apr 17 11:16:39.657609 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:39.657263 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8"
Apr 17 11:16:39.657609 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:39.657351 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh"
Apr 17 11:16:39.657609 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:39.657457 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f"
Apr 17 11:16:39.657609 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:39.657498 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf"
Apr 17 11:16:39.657609 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:39.657550 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f"
Apr 17 11:16:41.188386 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:41.188338 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs\") pod \"network-metrics-daemon-9k5nh\" (UID: \"bee0bc88-7732-4010-9886-3df7384bf1c8\") " pod="openshift-multus/network-metrics-daemon-9k5nh"
Apr 17 11:16:41.188782 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:41.188540 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:41.188782 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:41.188622 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs podName:bee0bc88-7732-4010-9886-3df7384bf1c8 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:49.188599204 +0000 UTC m=+18.130753424 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs") pod "network-metrics-daemon-9k5nh" (UID: "bee0bc88-7732-4010-9886-3df7384bf1c8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:41.289246 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:41.288896 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbwzg\" (UniqueName: \"kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg\") pod \"network-check-target-rdxhf\" (UID: \"1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f\") " pod="openshift-network-diagnostics/network-check-target-rdxhf"
Apr 17 11:16:41.289246 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:41.289142 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:41.289246 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:41.289162 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:41.289246 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:41.289174 2581 projected.go:194] Error preparing data for projected volume kube-api-access-dbwzg for pod openshift-network-diagnostics/network-check-target-rdxhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:41.289246 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:41.289239 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg podName:1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f nodeName:}" failed. No retries permitted until 2026-04-17 11:16:49.289219633 +0000 UTC m=+18.231373855 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dbwzg" (UniqueName: "kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg") pod "network-check-target-rdxhf" (UID: "1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:41.491269 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:41.491176 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret\") pod \"global-pull-secret-syncer-t82fh\" (UID: \"09f10455-02ae-4c95-91a9-6c0b6af2b02f\") " pod="kube-system/global-pull-secret-syncer-t82fh"
Apr 17 11:16:41.491441 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:41.491341 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:41.491441 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:41.491425 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret podName:09f10455-02ae-4c95-91a9-6c0b6af2b02f nodeName:}" failed. No retries permitted until 2026-04-17 11:16:49.491403033 +0000 UTC m=+18.433557257 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret") pod "global-pull-secret-syncer-t82fh" (UID: "09f10455-02ae-4c95-91a9-6c0b6af2b02f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:41.658018 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:41.657981 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh"
Apr 17 11:16:41.658171 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:41.658108 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8"
Apr 17 11:16:41.658376 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:41.658347 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf"
Apr 17 11:16:41.658520 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:41.658459 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f"
Apr 17 11:16:41.658520 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:41.658504 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh"
Apr 17 11:16:41.658635 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:41.658577 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f"
Apr 17 11:16:43.657725 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:43.657695 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh"
Apr 17 11:16:43.658164 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:43.657695 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf"
Apr 17 11:16:43.658164 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:43.657695 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh"
Apr 17 11:16:43.658164 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:43.657848 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8"
Apr 17 11:16:43.658164 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:43.657930 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f"
Apr 17 11:16:43.658164 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:43.658012 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f"
Apr 17 11:16:45.658105 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:45.658071 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf"
Apr 17 11:16:45.658678 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:45.658192 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f"
Apr 17 11:16:45.658678 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:45.658217 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh"
Apr 17 11:16:45.658678 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:45.658077 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh"
Apr 17 11:16:45.658678 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:45.658323 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f"
Apr 17 11:16:45.658678 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:45.658410 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8"
Apr 17 11:16:47.656947 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:47.656915 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh"
Apr 17 11:16:47.657384 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:47.656915 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh"
Apr 17 11:16:47.657384 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:47.657055 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf"
Apr 17 11:16:47.657384 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:47.657050 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8"
Apr 17 11:16:47.657384 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:47.657133 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f"
Apr 17 11:16:47.657384 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:47.657209 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f" Apr 17 11:16:49.245331 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:49.245302 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs\") pod \"network-metrics-daemon-9k5nh\" (UID: \"bee0bc88-7732-4010-9886-3df7384bf1c8\") " pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:16:49.245781 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:49.245461 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:49.245781 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:49.245538 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs podName:bee0bc88-7732-4010-9886-3df7384bf1c8 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:05.245517346 +0000 UTC m=+34.187671575 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs") pod "network-metrics-daemon-9k5nh" (UID: "bee0bc88-7732-4010-9886-3df7384bf1c8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:49.346677 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:49.346637 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbwzg\" (UniqueName: \"kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg\") pod \"network-check-target-rdxhf\" (UID: \"1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f\") " pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:16:49.346876 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:49.346787 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:49.346876 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:49.346804 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:49.346876 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:49.346833 2581 projected.go:194] Error preparing data for projected volume kube-api-access-dbwzg for pod openshift-network-diagnostics/network-check-target-rdxhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:49.347028 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:49.346898 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg podName:1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f nodeName:}" failed. 
No retries permitted until 2026-04-17 11:17:05.346878763 +0000 UTC m=+34.289032986 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dbwzg" (UniqueName: "kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg") pod "network-check-target-rdxhf" (UID: "1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:49.548238 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:49.548144 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret\") pod \"global-pull-secret-syncer-t82fh\" (UID: \"09f10455-02ae-4c95-91a9-6c0b6af2b02f\") " pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:49.548389 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:49.548312 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:49.548389 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:49.548387 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret podName:09f10455-02ae-4c95-91a9-6c0b6af2b02f nodeName:}" failed. No retries permitted until 2026-04-17 11:17:05.54837183 +0000 UTC m=+34.490526048 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret") pod "global-pull-secret-syncer-t82fh" (UID: "09f10455-02ae-4c95-91a9-6c0b6af2b02f") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:49.657426 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:49.657389 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:16:49.657579 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:49.657437 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:49.657579 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:49.657487 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:16:49.657690 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:49.657623 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8" Apr 17 11:16:49.657740 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:49.657683 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f" Apr 17 11:16:49.657796 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:49.657747 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f" Apr 17 11:16:51.658772 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:51.657316 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:16:51.658772 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:51.657317 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:51.658772 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:51.658402 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8" Apr 17 11:16:51.658772 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:51.658461 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f" Apr 17 11:16:51.658772 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:51.658512 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:16:51.658772 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:51.658582 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f" Apr 17 11:16:51.770025 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:51.769968 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" event={"ID":"38f8cb42-d739-4806-98ed-206508f9cc9c","Type":"ContainerStarted","Data":"a313d84f421de3477a01924217da2b75cd551ab6f682f906e15b37c8dff02c1a"} Apr 17 11:16:51.771466 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:51.771435 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-szct7" event={"ID":"2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b","Type":"ContainerStarted","Data":"c6e43a22f36d6028c08fe8a63b6b89df8c28f3dad0130c3c2103e46ae8a00e6a"} Apr 17 11:16:51.773252 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:51.773228 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" event={"ID":"32f662eb-3388-4bc8-9550-6e567567f548","Type":"ContainerStarted","Data":"69b4d7523cc0a6c847e02b68cece7d79618186456fa7d7227e7d89dce6682687"} Apr 17 11:16:51.774392 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:51.774325 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b92zr" event={"ID":"a3311f8e-8452-4224-8b40-1d0392b66a65","Type":"ContainerStarted","Data":"9425cd9e7ff81132131eb419d68593837a03f4144b62331eb638f6c0fedf8f1d"} Apr 17 11:16:51.775875 ip-10-0-136-177 
kubenswrapper[2581]: I0417 11:16:51.775852 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8mkn6" event={"ID":"5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96","Type":"ContainerStarted","Data":"736daf964da4a72a0dce09a50b83d4e235784f56ccde183854d9441c741cea10"} Apr 17 11:16:51.777219 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:51.777201 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" event={"ID":"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56","Type":"ContainerStarted","Data":"c91d58c7e6c1767a7c8608b43129c3f268e242c302697f3eecc26ab4e6d66901"} Apr 17 11:16:51.778249 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:51.778232 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sdhv4" event={"ID":"3b8f2274-31f6-413e-944a-132f5e8db8f6","Type":"ContainerStarted","Data":"09624fcc06ed66a1eb874bd45dfc749ff23bb717c55b54e9b2e5bd54fc1ae2b8"} Apr 17 11:16:51.779559 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:51.779539 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qxx8v" event={"ID":"ba8cdaa8-b4aa-4821-9350-c03f9eb0b50d","Type":"ContainerStarted","Data":"37a2d08719e80aa8fbf5cbc9102b0328bced08f7b2516ff94acc963ca2d51fe0"} Apr 17 11:16:51.791728 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:51.791686 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ldnwz" podStartSLOduration=3.6272313929999997 podStartE2EDuration="20.791652568s" podCreationTimestamp="2026-04-17 11:16:31 +0000 UTC" firstStartedPulling="2026-04-17 11:16:34.242950061 +0000 UTC m=+3.185104279" lastFinishedPulling="2026-04-17 11:16:51.407371224 +0000 UTC m=+20.349525454" observedRunningTime="2026-04-17 11:16:51.791006865 +0000 UTC m=+20.733161232" watchObservedRunningTime="2026-04-17 11:16:51.791652568 +0000 UTC m=+20.733806809" Apr 17 11:16:51.850530 ip-10-0-136-177 
kubenswrapper[2581]: I0417 11:16:51.850259 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-szct7" podStartSLOduration=3.415985837 podStartE2EDuration="20.850240375s" podCreationTimestamp="2026-04-17 11:16:31 +0000 UTC" firstStartedPulling="2026-04-17 11:16:34.235590126 +0000 UTC m=+3.177744345" lastFinishedPulling="2026-04-17 11:16:51.669844661 +0000 UTC m=+20.611998883" observedRunningTime="2026-04-17 11:16:51.83434908 +0000 UTC m=+20.776503323" watchObservedRunningTime="2026-04-17 11:16:51.850240375 +0000 UTC m=+20.792394617" Apr 17 11:16:51.850851 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:51.850758 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-qxx8v" podStartSLOduration=8.266467479 podStartE2EDuration="20.850745931s" podCreationTimestamp="2026-04-17 11:16:31 +0000 UTC" firstStartedPulling="2026-04-17 11:16:34.236132312 +0000 UTC m=+3.178286745" lastFinishedPulling="2026-04-17 11:16:46.820410962 +0000 UTC m=+15.762565197" observedRunningTime="2026-04-17 11:16:51.849607561 +0000 UTC m=+20.791761804" watchObservedRunningTime="2026-04-17 11:16:51.850745931 +0000 UTC m=+20.792900175" Apr 17 11:16:51.868025 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:51.867985 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sdhv4" podStartSLOduration=3.701657507 podStartE2EDuration="20.867971729s" podCreationTimestamp="2026-04-17 11:16:31 +0000 UTC" firstStartedPulling="2026-04-17 11:16:34.239036775 +0000 UTC m=+3.181191005" lastFinishedPulling="2026-04-17 11:16:51.405351008 +0000 UTC m=+20.347505227" observedRunningTime="2026-04-17 11:16:51.867812002 +0000 UTC m=+20.809966239" watchObservedRunningTime="2026-04-17 11:16:51.867971729 +0000 UTC m=+20.810125970" Apr 17 11:16:51.883009 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:51.882959 2581 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-dns/node-resolver-8mkn6" podStartSLOduration=3.72034364 podStartE2EDuration="20.882944797s" podCreationTimestamp="2026-04-17 11:16:31 +0000 UTC" firstStartedPulling="2026-04-17 11:16:34.242748283 +0000 UTC m=+3.184902505" lastFinishedPulling="2026-04-17 11:16:51.405349437 +0000 UTC m=+20.347503662" observedRunningTime="2026-04-17 11:16:51.88256142 +0000 UTC m=+20.824715662" watchObservedRunningTime="2026-04-17 11:16:51.882944797 +0000 UTC m=+20.825099037" Apr 17 11:16:52.626500 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:52.626475 2581 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 11:16:52.782857 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:52.782800 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" event={"ID":"32f662eb-3388-4bc8-9550-6e567567f548","Type":"ContainerStarted","Data":"d47f7cefd36b55c6d46eff5e7afb8d4f823ad7e1c11744d6f86b3183194af5f7"} Apr 17 11:16:52.784038 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:52.784009 2581 generic.go:358] "Generic (PLEG): container finished" podID="a3311f8e-8452-4224-8b40-1d0392b66a65" containerID="9425cd9e7ff81132131eb419d68593837a03f4144b62331eb638f6c0fedf8f1d" exitCode=0 Apr 17 11:16:52.784143 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:52.784073 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b92zr" event={"ID":"a3311f8e-8452-4224-8b40-1d0392b66a65","Type":"ContainerDied","Data":"9425cd9e7ff81132131eb419d68593837a03f4144b62331eb638f6c0fedf8f1d"} Apr 17 11:16:52.786797 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:52.786768 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" 
event={"ID":"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56","Type":"ContainerStarted","Data":"ee9ea8589b3e83cb804f3fc5fd7a4762142f3803e6a25c65484e189f99e9e166"} Apr 17 11:16:52.786905 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:52.786806 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" event={"ID":"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56","Type":"ContainerStarted","Data":"5811ad21333047cf20ed7c58ef13ff60f34c8434ed582628898b41660f74fa81"} Apr 17 11:16:52.786905 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:52.786839 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" event={"ID":"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56","Type":"ContainerStarted","Data":"8ad0717927cf74337d2ed35a5e8bfad8b73c5bed2b86b3ba4e7b64adc421e30c"} Apr 17 11:16:52.786905 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:52.786853 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" event={"ID":"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56","Type":"ContainerStarted","Data":"7f7389f79dcff6a690ae7e3f297864ffc13918467878fd063446bfaebaa02162"} Apr 17 11:16:52.786905 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:52.786866 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" event={"ID":"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56","Type":"ContainerStarted","Data":"71b11ec93e6a00a5e50cbab7521b41cd5a0fbf5051e9b8083d2ccaf43a618b09"} Apr 17 11:16:53.589233 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:53.589132 2581 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T11:16:52.626496045Z","UUID":"36cd44c5-dba9-43ca-92d5-c53c68d5f769","Handler":null,"Name":"","Endpoint":""} Apr 17 11:16:53.593036 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:53.592963 2581 csi_plugin.go:106] kubernetes.io/csi: 
Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 11:16:53.593036 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:53.592992 2581 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 11:16:53.657281 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:53.657243 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:53.657475 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:53.657244 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:16:53.657475 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:53.657369 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f" Apr 17 11:16:53.657475 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:53.657430 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f" Apr 17 11:16:53.657475 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:53.657245 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:16:53.657675 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:53.657576 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8" Apr 17 11:16:53.790813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:53.790721 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fvtct" event={"ID":"505d1e1b-000d-4203-b021-c56c5c5d8c56","Type":"ContainerStarted","Data":"26e616dc8f8511bbd9e6f3c016aa7d69b5c412c0324d6fd09f163459edd03051"} Apr 17 11:16:53.792766 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:53.792735 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" event={"ID":"32f662eb-3388-4bc8-9550-6e567567f548","Type":"ContainerStarted","Data":"30a188885e316e82a492d2c46d22433a55195d1d7edf370ed82b75f573356a3a"} Apr 17 11:16:53.812375 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:53.812322 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fvtct" podStartSLOduration=5.649210183 podStartE2EDuration="22.812295001s" podCreationTimestamp="2026-04-17 11:16:31 +0000 UTC" firstStartedPulling="2026-04-17 11:16:34.242295296 +0000 UTC m=+3.184449518" lastFinishedPulling="2026-04-17 11:16:51.405380114 +0000 UTC m=+20.347534336" observedRunningTime="2026-04-17 11:16:53.811896567 +0000 UTC m=+22.754050808" watchObservedRunningTime="2026-04-17 11:16:53.812295001 +0000 UTC m=+22.754449243" Apr 17 11:16:53.840281 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:53.840234 2581 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dpf96" podStartSLOduration=3.584270846 podStartE2EDuration="22.840217616s" podCreationTimestamp="2026-04-17 11:16:31 +0000 UTC" firstStartedPulling="2026-04-17 11:16:34.23342787 +0000 UTC m=+3.175582088" lastFinishedPulling="2026-04-17 11:16:53.489374634 +0000 UTC m=+22.431528858" observedRunningTime="2026-04-17 11:16:53.839715587 +0000 UTC m=+22.781869854" watchObservedRunningTime="2026-04-17 11:16:53.840217616 +0000 UTC m=+22.782371858" Apr 17 11:16:54.800419 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:54.800379 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" event={"ID":"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56","Type":"ContainerStarted","Data":"d28b5984a10887fa4809a5a50d9d1b883a712f0345e86276a0b598c221b3abce"} Apr 17 11:16:55.657520 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:55.657270 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:16:55.657748 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:55.657270 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:55.657748 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:55.657636 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8" Apr 17 11:16:55.657748 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:55.657271 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:16:55.657931 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:55.657756 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f" Apr 17 11:16:55.657931 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:55.657843 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f" Apr 17 11:16:56.141172 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:56.141136 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-qxx8v" Apr 17 11:16:56.141778 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:56.141750 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-qxx8v" Apr 17 11:16:56.805780 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:56.805750 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-qxx8v" Apr 17 11:16:56.806781 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:56.806764 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-qxx8v" Apr 17 11:16:57.658137 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:57.657951 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:57.658973 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:57.657951 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:16:57.658973 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:57.658236 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f" Apr 17 11:16:57.658973 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:57.657951 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:16:57.658973 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:57.658330 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8" Apr 17 11:16:57.658973 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:57.658388 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f" Apr 17 11:16:57.807692 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:57.807663 2581 generic.go:358] "Generic (PLEG): container finished" podID="a3311f8e-8452-4224-8b40-1d0392b66a65" containerID="b1abd11da14a02c9d31a40a4d84f93866fcb58bbffb509d2993dd148aa3f1e7e" exitCode=0 Apr 17 11:16:57.807888 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:57.807714 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b92zr" event={"ID":"a3311f8e-8452-4224-8b40-1d0392b66a65","Type":"ContainerDied","Data":"b1abd11da14a02c9d31a40a4d84f93866fcb58bbffb509d2993dd148aa3f1e7e"} Apr 17 11:16:57.810794 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:57.810770 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" event={"ID":"4ab6c44c-fe99-4f3a-a19e-93959e1d3d56","Type":"ContainerStarted","Data":"be28f10837075586f492f7cb69d2c530ff2b621cec28c9466b775ea4fee27414"} Apr 17 11:16:57.811319 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:57.811299 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:57.811416 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:57.811330 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:57.811416 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:57.811352 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:57.826396 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:57.826356 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:57.826557 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:57.826542 2581 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" Apr 17 11:16:57.857578 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:57.857529 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t" podStartSLOduration=9.457759286 podStartE2EDuration="26.857515987s" podCreationTimestamp="2026-04-17 11:16:31 +0000 UTC" firstStartedPulling="2026-04-17 11:16:34.241325654 +0000 UTC m=+3.183479873" lastFinishedPulling="2026-04-17 11:16:51.641082319 +0000 UTC m=+20.583236574" observedRunningTime="2026-04-17 11:16:57.856780106 +0000 UTC m=+26.798934345" watchObservedRunningTime="2026-04-17 11:16:57.857515987 +0000 UTC m=+26.799670227" Apr 17 11:16:58.772784 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:58.772689 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9k5nh"] Apr 17 11:16:58.773237 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:58.772831 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:16:58.773237 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:58.772923 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8" Apr 17 11:16:58.779425 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:58.779398 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rdxhf"] Apr 17 11:16:58.779583 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:58.779524 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:16:58.779658 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:58.779637 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f" Apr 17 11:16:58.780208 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:58.780182 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-t82fh"] Apr 17 11:16:58.780303 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:58.780290 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:16:58.780396 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:16:58.780375 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f" Apr 17 11:16:58.814841 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:58.814788 2581 generic.go:358] "Generic (PLEG): container finished" podID="a3311f8e-8452-4224-8b40-1d0392b66a65" containerID="a0369a72c4b9aebd2ca987943a3604f7a6f0f38ebe763ca98e543bf42f82f14c" exitCode=0 Apr 17 11:16:58.814988 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:58.814850 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b92zr" event={"ID":"a3311f8e-8452-4224-8b40-1d0392b66a65","Type":"ContainerDied","Data":"a0369a72c4b9aebd2ca987943a3604f7a6f0f38ebe763ca98e543bf42f82f14c"} Apr 17 11:16:59.818458 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:59.818421 2581 generic.go:358] "Generic (PLEG): container finished" podID="a3311f8e-8452-4224-8b40-1d0392b66a65" containerID="dda8a6863230314d14935ebde0b8e88b36ce9de6d108143392d2e1cba3473c24" exitCode=0 Apr 17 11:16:59.818912 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:16:59.818499 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b92zr" event={"ID":"a3311f8e-8452-4224-8b40-1d0392b66a65","Type":"ContainerDied","Data":"dda8a6863230314d14935ebde0b8e88b36ce9de6d108143392d2e1cba3473c24"} Apr 17 11:17:00.657260 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:00.657228 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:17:00.657459 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:00.657268 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:17:00.657459 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:00.657349 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:17:00.657459 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:00.657359 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f" Apr 17 11:17:00.657673 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:00.657456 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f" Apr 17 11:17:00.657673 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:00.657565 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8" Apr 17 11:17:02.657743 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:02.657708 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:17:02.658333 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:02.657754 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:17:02.658333 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:02.657721 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:17:02.658333 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:02.657850 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rdxhf" podUID="1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f" Apr 17 11:17:02.658333 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:02.657941 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8" Apr 17 11:17:02.658333 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:02.658024 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-t82fh" podUID="09f10455-02ae-4c95-91a9-6c0b6af2b02f" Apr 17 11:17:04.370444 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.370414 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-177.ec2.internal" event="NodeReady" Apr 17 11:17:04.371001 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.370584 2581 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 11:17:04.421799 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.421729 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7wbxp"] Apr 17 11:17:04.455112 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.455083 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rpkz7"] Apr 17 11:17:04.455271 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.455254 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7wbxp" Apr 17 11:17:04.457853 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.457804 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 11:17:04.457853 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.457811 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 11:17:04.458040 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.457874 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5xzqf\"" Apr 17 11:17:04.458040 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.457962 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 11:17:04.476335 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.476307 2581 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7wbxp"] Apr 17 11:17:04.476335 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.476336 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rpkz7"] Apr 17 11:17:04.476494 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.476450 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rpkz7" Apr 17 11:17:04.479074 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.479042 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 11:17:04.479210 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.479150 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 11:17:04.479277 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.479214 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-v5zg9\"" Apr 17 11:17:04.559658 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.559609 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7" Apr 17 11:17:04.559848 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.559677 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz2rc\" (UniqueName: \"kubernetes.io/projected/8fe4a1b8-871d-4f96-95f0-946d542180da-kube-api-access-sz2rc\") pod \"ingress-canary-7wbxp\" (UID: \"8fe4a1b8-871d-4f96-95f0-946d542180da\") " pod="openshift-ingress-canary/ingress-canary-7wbxp" Apr 17 11:17:04.559848 ip-10-0-136-177 kubenswrapper[2581]: I0417 
11:17:04.559752 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-config-volume\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7" Apr 17 11:17:04.559848 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.559775 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-tmp-dir\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7" Apr 17 11:17:04.559848 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.559803 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert\") pod \"ingress-canary-7wbxp\" (UID: \"8fe4a1b8-871d-4f96-95f0-946d542180da\") " pod="openshift-ingress-canary/ingress-canary-7wbxp" Apr 17 11:17:04.559848 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.559838 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tpxs\" (UniqueName: \"kubernetes.io/projected/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-kube-api-access-5tpxs\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7" Apr 17 11:17:04.657119 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.657031 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:17:04.657119 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.657032 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:17:04.657119 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.657050 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.660206 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-config-volume\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7" Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.660234 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.660243 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-tmp-dir\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7" Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.660276 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert\") pod \"ingress-canary-7wbxp\" (UID: \"8fe4a1b8-871d-4f96-95f0-946d542180da\") " pod="openshift-ingress-canary/ingress-canary-7wbxp" Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.660305 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tpxs\" (UniqueName: \"kubernetes.io/projected/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-kube-api-access-5tpxs\") pod \"dns-default-rpkz7\" (UID: 
\"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7" Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.660351 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7" Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.660377 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.660391 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sz2rc\" (UniqueName: \"kubernetes.io/projected/8fe4a1b8-871d-4f96-95f0-946d542180da-kube-api-access-sz2rc\") pod \"ingress-canary-7wbxp\" (UID: \"8fe4a1b8-871d-4f96-95f0-946d542180da\") " pod="openshift-ingress-canary/ingress-canary-7wbxp" Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.660458 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ntppc\"" Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.660548 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:04.660615 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:04.660676 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls podName:2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3 
nodeName:}" failed. No retries permitted until 2026-04-17 11:17:05.160656386 +0000 UTC m=+34.102810609 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls") pod "dns-default-rpkz7" (UID: "2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3") : secret "dns-default-metrics-tls" not found Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.660696 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gg44j\"" Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:04.660733 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.660771 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-tmp-dir\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7" Apr 17 11:17:04.660813 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:04.660802 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert podName:8fe4a1b8-871d-4f96-95f0-946d542180da nodeName:}" failed. No retries permitted until 2026-04-17 11:17:05.160784171 +0000 UTC m=+34.102938399 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert") pod "ingress-canary-7wbxp" (UID: "8fe4a1b8-871d-4f96-95f0-946d542180da") : secret "canary-serving-cert" not found Apr 17 11:17:04.661594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.661013 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 11:17:04.661594 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.661248 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-config-volume\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7" Apr 17 11:17:04.672456 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.672429 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tpxs\" (UniqueName: \"kubernetes.io/projected/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-kube-api-access-5tpxs\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7" Apr 17 11:17:04.672645 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:04.672624 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz2rc\" (UniqueName: \"kubernetes.io/projected/8fe4a1b8-871d-4f96-95f0-946d542180da-kube-api-access-sz2rc\") pod \"ingress-canary-7wbxp\" (UID: \"8fe4a1b8-871d-4f96-95f0-946d542180da\") " pod="openshift-ingress-canary/ingress-canary-7wbxp" Apr 17 11:17:05.164243 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:05.164205 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert\") pod \"ingress-canary-7wbxp\" (UID: \"8fe4a1b8-871d-4f96-95f0-946d542180da\") " 
pod="openshift-ingress-canary/ingress-canary-7wbxp" Apr 17 11:17:05.164420 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:05.164267 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7" Apr 17 11:17:05.164420 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:05.164359 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:05.164420 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:05.164371 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:05.164420 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:05.164421 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert podName:8fe4a1b8-871d-4f96-95f0-946d542180da nodeName:}" failed. No retries permitted until 2026-04-17 11:17:06.1644048 +0000 UTC m=+35.106559024 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert") pod "ingress-canary-7wbxp" (UID: "8fe4a1b8-871d-4f96-95f0-946d542180da") : secret "canary-serving-cert" not found Apr 17 11:17:05.164577 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:05.164435 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls podName:2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:06.164429383 +0000 UTC m=+35.106583602 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls") pod "dns-default-rpkz7" (UID: "2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3") : secret "dns-default-metrics-tls" not found Apr 17 11:17:05.265628 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:05.265591 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs\") pod \"network-metrics-daemon-9k5nh\" (UID: \"bee0bc88-7732-4010-9886-3df7384bf1c8\") " pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:17:05.265800 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:05.265727 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 11:17:05.265872 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:05.265804 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs podName:bee0bc88-7732-4010-9886-3df7384bf1c8 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:37.265780468 +0000 UTC m=+66.207934693 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs") pod "network-metrics-daemon-9k5nh" (UID: "bee0bc88-7732-4010-9886-3df7384bf1c8") : secret "metrics-daemon-secret" not found Apr 17 11:17:05.365966 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:05.365923 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbwzg\" (UniqueName: \"kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg\") pod \"network-check-target-rdxhf\" (UID: \"1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f\") " pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:17:05.368759 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:05.368727 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbwzg\" (UniqueName: \"kubernetes.io/projected/1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f-kube-api-access-dbwzg\") pod \"network-check-target-rdxhf\" (UID: \"1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f\") " pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:17:05.567188 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:05.567095 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret\") pod \"global-pull-secret-syncer-t82fh\" (UID: \"09f10455-02ae-4c95-91a9-6c0b6af2b02f\") " pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:17:05.569906 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:05.569872 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09f10455-02ae-4c95-91a9-6c0b6af2b02f-original-pull-secret\") pod \"global-pull-secret-syncer-t82fh\" (UID: \"09f10455-02ae-4c95-91a9-6c0b6af2b02f\") " pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 
11:17:05.570044 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:05.569980 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t82fh" Apr 17 11:17:05.591047 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:05.591006 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rdxhf" Apr 17 11:17:06.069781 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:06.069735 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-t82fh"] Apr 17 11:17:06.070558 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:06.070527 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rdxhf"] Apr 17 11:17:06.130299 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:17:06.130265 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1679a0d1_a3f2_40a1_aa7a_e0d8183e0f7f.slice/crio-643dc1a1d175357fba548a23b8c4a939df838d041f6da17aa7f002c95c42266e WatchSource:0}: Error finding container 643dc1a1d175357fba548a23b8c4a939df838d041f6da17aa7f002c95c42266e: Status 404 returned error can't find the container with id 643dc1a1d175357fba548a23b8c4a939df838d041f6da17aa7f002c95c42266e Apr 17 11:17:06.130687 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:17:06.130667 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09f10455_02ae_4c95_91a9_6c0b6af2b02f.slice/crio-334f73fd5dd85dfa2cc7da361ea6bcb79a7b93518bfd4acdaf319335fb90be12 WatchSource:0}: Error finding container 334f73fd5dd85dfa2cc7da361ea6bcb79a7b93518bfd4acdaf319335fb90be12: Status 404 returned error can't find the container with id 334f73fd5dd85dfa2cc7da361ea6bcb79a7b93518bfd4acdaf319335fb90be12 Apr 17 11:17:06.173643 ip-10-0-136-177 kubenswrapper[2581]: I0417 
11:17:06.173622 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert\") pod \"ingress-canary-7wbxp\" (UID: \"8fe4a1b8-871d-4f96-95f0-946d542180da\") " pod="openshift-ingress-canary/ingress-canary-7wbxp"
Apr 17 11:17:06.173731 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:06.173677 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7"
Apr 17 11:17:06.173780 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:06.173769 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:17:06.173841 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:06.173770 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:17:06.173880 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:06.173849 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls podName:2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:08.173812252 +0000 UTC m=+37.115966471 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls") pod "dns-default-rpkz7" (UID: "2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3") : secret "dns-default-metrics-tls" not found
Apr 17 11:17:06.173880 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:06.173877 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert podName:8fe4a1b8-871d-4f96-95f0-946d542180da nodeName:}" failed. No retries permitted until 2026-04-17 11:17:08.17386096 +0000 UTC m=+37.116015191 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert") pod "ingress-canary-7wbxp" (UID: "8fe4a1b8-871d-4f96-95f0-946d542180da") : secret "canary-serving-cert" not found
Apr 17 11:17:06.835231 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:06.835147 2581 generic.go:358] "Generic (PLEG): container finished" podID="a3311f8e-8452-4224-8b40-1d0392b66a65" containerID="6afb5e8e4f8baf8a70be70e77bc6c534d935013344a6313f433cb2f98dd4596b" exitCode=0
Apr 17 11:17:06.835707 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:06.835249 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b92zr" event={"ID":"a3311f8e-8452-4224-8b40-1d0392b66a65","Type":"ContainerDied","Data":"6afb5e8e4f8baf8a70be70e77bc6c534d935013344a6313f433cb2f98dd4596b"}
Apr 17 11:17:06.836770 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:06.836733 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-t82fh" event={"ID":"09f10455-02ae-4c95-91a9-6c0b6af2b02f","Type":"ContainerStarted","Data":"334f73fd5dd85dfa2cc7da361ea6bcb79a7b93518bfd4acdaf319335fb90be12"}
Apr 17 11:17:06.837891 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:06.837856 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rdxhf" event={"ID":"1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f","Type":"ContainerStarted","Data":"643dc1a1d175357fba548a23b8c4a939df838d041f6da17aa7f002c95c42266e"}
Apr 17 11:17:07.843623 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:07.843510 2581 generic.go:358] "Generic (PLEG): container finished" podID="a3311f8e-8452-4224-8b40-1d0392b66a65" containerID="301d5b935e09c1c734935e594964f5dec2b3155536984e9d8d7ebd1295fb85e8" exitCode=0
Apr 17 11:17:07.843623 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:07.843580 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b92zr" event={"ID":"a3311f8e-8452-4224-8b40-1d0392b66a65","Type":"ContainerDied","Data":"301d5b935e09c1c734935e594964f5dec2b3155536984e9d8d7ebd1295fb85e8"}
Apr 17 11:17:08.190211 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:08.189949 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert\") pod \"ingress-canary-7wbxp\" (UID: \"8fe4a1b8-871d-4f96-95f0-946d542180da\") " pod="openshift-ingress-canary/ingress-canary-7wbxp"
Apr 17 11:17:08.190398 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:08.190239 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7"
Apr 17 11:17:08.190398 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:08.190120 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:17:08.190398 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:08.190355 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert podName:8fe4a1b8-871d-4f96-95f0-946d542180da nodeName:}" failed. No retries permitted until 2026-04-17 11:17:12.190331989 +0000 UTC m=+41.132486212 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert") pod "ingress-canary-7wbxp" (UID: "8fe4a1b8-871d-4f96-95f0-946d542180da") : secret "canary-serving-cert" not found
Apr 17 11:17:08.190572 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:08.190409 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:17:08.190572 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:08.190459 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls podName:2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:12.190441775 +0000 UTC m=+41.132595997 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls") pod "dns-default-rpkz7" (UID: "2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3") : secret "dns-default-metrics-tls" not found
Apr 17 11:17:08.849641 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:08.849604 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b92zr" event={"ID":"a3311f8e-8452-4224-8b40-1d0392b66a65","Type":"ContainerStarted","Data":"c74bf431c85f1c8733d36d613515fd09848a3bd0c5c0c6faffc045abcc7e099b"}
Apr 17 11:17:08.873996 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:08.873937 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b92zr" podStartSLOduration=5.946013282 podStartE2EDuration="37.873915456s" podCreationTimestamp="2026-04-17 11:16:31 +0000 UTC" firstStartedPulling="2026-04-17 11:16:34.232492112 +0000 UTC m=+3.174646337" lastFinishedPulling="2026-04-17 11:17:06.160394292 +0000 UTC m=+35.102548511" observedRunningTime="2026-04-17 11:17:08.872604617 +0000 UTC m=+37.814758859" watchObservedRunningTime="2026-04-17 11:17:08.873915456 +0000 UTC m=+37.816069691"
Apr 17 11:17:09.000758 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:09.000724 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx"]
Apr 17 11:17:09.015108 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:09.015077 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx"
Apr 17 11:17:09.017876 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:09.017848 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx"]
Apr 17 11:17:09.018115 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:09.018090 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-hbqmn\""
Apr 17 11:17:09.019140 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:09.019119 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 17 11:17:09.019241 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:09.019164 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 17 11:17:09.019489 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:09.019471 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 17 11:17:09.019572 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:09.019488 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 17 11:17:09.098750 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:09.098715 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c8fd482-126f-4f85-ab60-3c44fd0c243e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6db7d65f79-r8wpx\" (UID: \"5c8fd482-126f-4f85-ab60-3c44fd0c243e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx"
Apr 17 11:17:09.098961 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:09.098775 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhpnc\" (UniqueName: \"kubernetes.io/projected/5c8fd482-126f-4f85-ab60-3c44fd0c243e-kube-api-access-jhpnc\") pod \"managed-serviceaccount-addon-agent-6db7d65f79-r8wpx\" (UID: \"5c8fd482-126f-4f85-ab60-3c44fd0c243e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx"
Apr 17 11:17:09.199763 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:09.199727 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c8fd482-126f-4f85-ab60-3c44fd0c243e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6db7d65f79-r8wpx\" (UID: \"5c8fd482-126f-4f85-ab60-3c44fd0c243e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx"
Apr 17 11:17:09.199963 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:09.199784 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhpnc\" (UniqueName: \"kubernetes.io/projected/5c8fd482-126f-4f85-ab60-3c44fd0c243e-kube-api-access-jhpnc\") pod \"managed-serviceaccount-addon-agent-6db7d65f79-r8wpx\" (UID: \"5c8fd482-126f-4f85-ab60-3c44fd0c243e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx"
Apr 17 11:17:09.204089 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:09.204065 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c8fd482-126f-4f85-ab60-3c44fd0c243e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6db7d65f79-r8wpx\" (UID: \"5c8fd482-126f-4f85-ab60-3c44fd0c243e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx"
Apr 17 11:17:09.209685 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:09.209662 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhpnc\" (UniqueName: \"kubernetes.io/projected/5c8fd482-126f-4f85-ab60-3c44fd0c243e-kube-api-access-jhpnc\") pod \"managed-serviceaccount-addon-agent-6db7d65f79-r8wpx\" (UID: \"5c8fd482-126f-4f85-ab60-3c44fd0c243e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx"
Apr 17 11:17:09.334303 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:09.334255 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx"
Apr 17 11:17:10.988736 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:10.988690 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx"]
Apr 17 11:17:11.000016 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:17:10.999984 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c8fd482_126f_4f85_ab60_3c44fd0c243e.slice/crio-013d4a81de8e4fa7078bb0e5139d1e904c733bec32c799e85a6b9d55c7a98377 WatchSource:0}: Error finding container 013d4a81de8e4fa7078bb0e5139d1e904c733bec32c799e85a6b9d55c7a98377: Status 404 returned error can't find the container with id 013d4a81de8e4fa7078bb0e5139d1e904c733bec32c799e85a6b9d55c7a98377
Apr 17 11:17:11.862629 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:11.862590 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-t82fh" event={"ID":"09f10455-02ae-4c95-91a9-6c0b6af2b02f","Type":"ContainerStarted","Data":"9fb6616d64771107637fd17c7cc26c536b74187648392cce290d76f498d62859"}
Apr 17 11:17:11.864565 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:11.864534 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rdxhf" event={"ID":"1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f","Type":"ContainerStarted","Data":"24e70fba814111d12cefe294172ca61812f014d38e9ab32f43bf90c9f03fa224"}
Apr 17 11:17:11.864740 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:11.864719 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rdxhf"
Apr 17 11:17:11.865752 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:11.865722 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx" event={"ID":"5c8fd482-126f-4f85-ab60-3c44fd0c243e","Type":"ContainerStarted","Data":"013d4a81de8e4fa7078bb0e5139d1e904c733bec32c799e85a6b9d55c7a98377"}
Apr 17 11:17:11.881482 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:11.881371 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-t82fh" podStartSLOduration=34.143346179 podStartE2EDuration="38.881356492s" podCreationTimestamp="2026-04-17 11:16:33 +0000 UTC" firstStartedPulling="2026-04-17 11:17:06.137288727 +0000 UTC m=+35.079442952" lastFinishedPulling="2026-04-17 11:17:10.875299041 +0000 UTC m=+39.817453265" observedRunningTime="2026-04-17 11:17:11.880907957 +0000 UTC m=+40.823062200" watchObservedRunningTime="2026-04-17 11:17:11.881356492 +0000 UTC m=+40.823510736"
Apr 17 11:17:11.899282 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:11.899209 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rdxhf" podStartSLOduration=36.171394219 podStartE2EDuration="40.899156237s" podCreationTimestamp="2026-04-17 11:16:31 +0000 UTC" firstStartedPulling="2026-04-17 11:17:06.137244762 +0000 UTC m=+35.079398981" lastFinishedPulling="2026-04-17 11:17:10.86500677 +0000 UTC m=+39.807160999" observedRunningTime="2026-04-17 11:17:11.898535895 +0000 UTC m=+40.840690134" watchObservedRunningTime="2026-04-17 11:17:11.899156237 +0000 UTC m=+40.841310480"
Apr 17 11:17:12.224017 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:12.223943 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert\") pod \"ingress-canary-7wbxp\" (UID: \"8fe4a1b8-871d-4f96-95f0-946d542180da\") " pod="openshift-ingress-canary/ingress-canary-7wbxp"
Apr 17 11:17:12.224017 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:12.223983 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7"
Apr 17 11:17:12.224387 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:12.224070 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:17:12.224387 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:12.224073 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:17:12.224387 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:12.224126 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls podName:2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:20.224109568 +0000 UTC m=+49.166263787 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls") pod "dns-default-rpkz7" (UID: "2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3") : secret "dns-default-metrics-tls" not found
Apr 17 11:17:12.224387 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:12.224141 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert podName:8fe4a1b8-871d-4f96-95f0-946d542180da nodeName:}" failed. No retries permitted until 2026-04-17 11:17:20.22413485 +0000 UTC m=+49.166289069 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert") pod "ingress-canary-7wbxp" (UID: "8fe4a1b8-871d-4f96-95f0-946d542180da") : secret "canary-serving-cert" not found
Apr 17 11:17:14.872292 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:14.872262 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx" event={"ID":"5c8fd482-126f-4f85-ab60-3c44fd0c243e","Type":"ContainerStarted","Data":"04d5fbf8b2b1bffe3ce691dab16be52f8e565771da2f38c46f7a46300b86908a"}
Apr 17 11:17:14.900319 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:14.900271 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx" podStartSLOduration=3.8268605730000003 podStartE2EDuration="6.900253159s" podCreationTimestamp="2026-04-17 11:17:08 +0000 UTC" firstStartedPulling="2026-04-17 11:17:11.002153315 +0000 UTC m=+39.944307534" lastFinishedPulling="2026-04-17 11:17:14.075545901 +0000 UTC m=+43.017700120" observedRunningTime="2026-04-17 11:17:14.899672246 +0000 UTC m=+43.841826487" watchObservedRunningTime="2026-04-17 11:17:14.900253159 +0000 UTC m=+43.842407440"
Apr 17 11:17:20.278311 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:20.278267 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert\") pod \"ingress-canary-7wbxp\" (UID: \"8fe4a1b8-871d-4f96-95f0-946d542180da\") " pod="openshift-ingress-canary/ingress-canary-7wbxp"
Apr 17 11:17:20.278311 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:20.278315 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7"
Apr 17 11:17:20.278728 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:20.278421 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:17:20.278728 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:20.278426 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:17:20.278728 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:20.278483 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls podName:2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:36.278469078 +0000 UTC m=+65.220623297 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls") pod "dns-default-rpkz7" (UID: "2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3") : secret "dns-default-metrics-tls" not found
Apr 17 11:17:20.278728 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:20.278498 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert podName:8fe4a1b8-871d-4f96-95f0-946d542180da nodeName:}" failed. No retries permitted until 2026-04-17 11:17:36.278490512 +0000 UTC m=+65.220644731 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert") pod "ingress-canary-7wbxp" (UID: "8fe4a1b8-871d-4f96-95f0-946d542180da") : secret "canary-serving-cert" not found
Apr 17 11:17:29.832710 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:29.832676 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q4q4t"
Apr 17 11:17:36.292904 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:36.292863 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert\") pod \"ingress-canary-7wbxp\" (UID: \"8fe4a1b8-871d-4f96-95f0-946d542180da\") " pod="openshift-ingress-canary/ingress-canary-7wbxp"
Apr 17 11:17:36.292904 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:36.292910 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7"
Apr 17 11:17:36.293331 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:36.293025 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:17:36.293331 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:36.293027 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:17:36.293331 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:36.293091 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls podName:2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:08.293076355 +0000 UTC m=+97.235230574 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls") pod "dns-default-rpkz7" (UID: "2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3") : secret "dns-default-metrics-tls" not found
Apr 17 11:17:36.293331 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:36.293104 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert podName:8fe4a1b8-871d-4f96-95f0-946d542180da nodeName:}" failed. No retries permitted until 2026-04-17 11:18:08.293098689 +0000 UTC m=+97.235252908 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert") pod "ingress-canary-7wbxp" (UID: "8fe4a1b8-871d-4f96-95f0-946d542180da") : secret "canary-serving-cert" not found
Apr 17 11:17:37.298622 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:37.298575 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs\") pod \"network-metrics-daemon-9k5nh\" (UID: \"bee0bc88-7732-4010-9886-3df7384bf1c8\") " pod="openshift-multus/network-metrics-daemon-9k5nh"
Apr 17 11:17:37.299140 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:37.298740 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 11:17:37.299140 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:17:37.298851 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs podName:bee0bc88-7732-4010-9886-3df7384bf1c8 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:41.298806111 +0000 UTC m=+130.240960335 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs") pod "network-metrics-daemon-9k5nh" (UID: "bee0bc88-7732-4010-9886-3df7384bf1c8") : secret "metrics-daemon-secret" not found
Apr 17 11:17:42.870576 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:17:42.870544 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rdxhf"
Apr 17 11:18:08.304004 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:08.303968 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert\") pod \"ingress-canary-7wbxp\" (UID: \"8fe4a1b8-871d-4f96-95f0-946d542180da\") " pod="openshift-ingress-canary/ingress-canary-7wbxp"
Apr 17 11:18:08.304438 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:08.304012 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7"
Apr 17 11:18:08.304438 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:08.304107 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:18:08.304438 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:08.304132 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:18:08.304438 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:08.304160 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls podName:2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3 nodeName:}" failed. No retries permitted until 2026-04-17 11:19:12.30414624 +0000 UTC m=+161.246300459 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls") pod "dns-default-rpkz7" (UID: "2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3") : secret "dns-default-metrics-tls" not found
Apr 17 11:18:08.304438 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:08.304210 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert podName:8fe4a1b8-871d-4f96-95f0-946d542180da nodeName:}" failed. No retries permitted until 2026-04-17 11:19:12.304191988 +0000 UTC m=+161.246346218 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert") pod "ingress-canary-7wbxp" (UID: "8fe4a1b8-871d-4f96-95f0-946d542180da") : secret "canary-serving-cert" not found
Apr 17 11:18:41.335224 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:41.335182 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs\") pod \"network-metrics-daemon-9k5nh\" (UID: \"bee0bc88-7732-4010-9886-3df7384bf1c8\") " pod="openshift-multus/network-metrics-daemon-9k5nh"
Apr 17 11:18:41.335727 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:41.335329 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 11:18:41.335727 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:41.335411 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs podName:bee0bc88-7732-4010-9886-3df7384bf1c8 nodeName:}" failed. No retries permitted until 2026-04-17 11:20:43.335394999 +0000 UTC m=+252.277549222 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs") pod "network-metrics-daemon-9k5nh" (UID: "bee0bc88-7732-4010-9886-3df7384bf1c8") : secret "metrics-daemon-secret" not found
Apr 17 11:18:43.058397 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.058357 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5f8c5cdcb8-ts845"]
Apr 17 11:18:43.061042 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.061025 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:43.063591 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.063567 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 17 11:18:43.063839 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.063807 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-pxldv\""
Apr 17 11:18:43.063919 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.063808 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 11:18:43.063919 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.063870 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 17 11:18:43.063919 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.063907 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 11:18:43.064075 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.063931 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 17 11:18:43.064075 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.064042 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 17 11:18:43.069866 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.069846 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5f8c5cdcb8-ts845"]
Apr 17 11:18:43.147468 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.147428 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:43.147639 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.147496 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-stats-auth\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:43.147639 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.147548 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:43.147639 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.147588 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-default-certificate\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:43.147639 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.147607 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7qqp\" (UniqueName: \"kubernetes.io/projected/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-kube-api-access-f7qqp\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:43.179056 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.179021 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zdr26"]
Apr 17 11:18:43.181771 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.181755 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26"
Apr 17 11:18:43.190480 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.190460 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-bx66d\""
Apr 17 11:18:43.191696 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.191680 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 11:18:43.205169 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.205145 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4"]
Apr 17 11:18:43.208019 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.208004 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4"
Apr 17 11:18:43.209650 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.209635 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 11:18:43.226452 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.226426 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-zvxxb\""
Apr 17 11:18:43.226452 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.226426 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:18:43.229261 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.229245 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 17 11:18:43.243938 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.243911 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 17 11:18:43.247354 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.247330 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7886847858-kdzp6"]
Apr 17 11:18:43.248673 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.248656 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:43.248738 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.248700 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-stats-auth\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:43.248738 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.248727 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:43.248846 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.248747 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/76699d2b-150e-4629-9c16-03548712a64f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zdr26\" (UID: \"76699d2b-150e-4629-9c16-03548712a64f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26"
Apr 17 11:18:43.248846 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.248835 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 11:18:43.248949 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.248848 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle podName:d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:43.748827181 +0000 UTC m=+132.690981413 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle") pod "router-default-5f8c5cdcb8-ts845" (UID: "d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707") : configmap references non-existent config key: service-ca.crt Apr 17 11:18:43.248949 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.248873 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-default-certificate\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" Apr 17 11:18:43.248949 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.248903 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs podName:d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:43.748885499 +0000 UTC m=+132.691039718 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs") pod "router-default-5f8c5cdcb8-ts845" (UID: "d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707") : secret "router-metrics-certs-default" not found Apr 17 11:18:43.249095 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.248944 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7qqp\" (UniqueName: \"kubernetes.io/projected/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-kube-api-access-f7qqp\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" Apr 17 11:18:43.249095 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.248979 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zdr26\" (UID: \"76699d2b-150e-4629-9c16-03548712a64f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26" Apr 17 11:18:43.250114 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.250094 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.251776 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.251756 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-stats-auth\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" Apr 17 11:18:43.251881 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.251787 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-default-certificate\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" Apr 17 11:18:43.264669 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.264642 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 11:18:43.274285 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.274261 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 11:18:43.274410 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.274315 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qf8qh\"" Apr 17 11:18:43.274995 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.274979 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 11:18:43.283097 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.283074 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 11:18:43.287635 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.287615 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7qqp\" (UniqueName: \"kubernetes.io/projected/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-kube-api-access-f7qqp\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" Apr 17 11:18:43.294189 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.294169 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4"] Apr 17 11:18:43.302625 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.302605 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7886847858-kdzp6"] Apr 17 11:18:43.333957 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.333873 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zdr26"] Apr 17 11:18:43.349628 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.349601 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e812ce5b-8977-4351-b49f-0cbf7496a798-trusted-ca\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.349628 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.349632 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rkxx4\" (UID: \"a2b7e351-8153-4e2a-b2c0-4127f4670018\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4" Apr 17 11:18:43.349813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.349673 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-certificates\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.349813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.349750 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/76699d2b-150e-4629-9c16-03548712a64f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zdr26\" (UID: \"76699d2b-150e-4629-9c16-03548712a64f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26" Apr 17 11:18:43.349813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.349768 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-bound-sa-token\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.349813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.349791 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zdr26\" (UID: \"76699d2b-150e-4629-9c16-03548712a64f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26" Apr 17 11:18:43.349813 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.349806 
2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.350036 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.349837 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e812ce5b-8977-4351-b49f-0cbf7496a798-installation-pull-secrets\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.350036 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.349855 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e812ce5b-8977-4351-b49f-0cbf7496a798-ca-trust-extracted\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.350036 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.349894 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf9h7\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-kube-api-access-lf9h7\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.350036 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.349919 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 11:18:43.350036 
ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.349978 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwpmm\" (UniqueName: \"kubernetes.io/projected/a2b7e351-8153-4e2a-b2c0-4127f4670018-kube-api-access-hwpmm\") pod \"cluster-samples-operator-6dc5bdb6b4-rkxx4\" (UID: \"a2b7e351-8153-4e2a-b2c0-4127f4670018\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4" Apr 17 11:18:43.350036 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.350030 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert podName:76699d2b-150e-4629-9c16-03548712a64f nodeName:}" failed. No retries permitted until 2026-04-17 11:18:43.85000692 +0000 UTC m=+132.792161156 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zdr26" (UID: "76699d2b-150e-4629-9c16-03548712a64f") : secret "networking-console-plugin-cert" not found Apr 17 11:18:43.350301 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.350092 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e812ce5b-8977-4351-b49f-0cbf7496a798-image-registry-private-configuration\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.350402 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.350383 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/76699d2b-150e-4629-9c16-03548712a64f-nginx-conf\") pod 
\"networking-console-plugin-cb95c66f6-zdr26\" (UID: \"76699d2b-150e-4629-9c16-03548712a64f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26" Apr 17 11:18:43.450743 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.450695 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.450743 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.450738 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e812ce5b-8977-4351-b49f-0cbf7496a798-installation-pull-secrets\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.450980 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.450759 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e812ce5b-8977-4351-b49f-0cbf7496a798-ca-trust-extracted\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.450980 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.450776 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lf9h7\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-kube-api-access-lf9h7\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.450980 ip-10-0-136-177 kubenswrapper[2581]: E0417 
11:18:43.450867 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:18:43.450980 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.450892 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7886847858-kdzp6: secret "image-registry-tls" not found Apr 17 11:18:43.450980 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.450892 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwpmm\" (UniqueName: \"kubernetes.io/projected/a2b7e351-8153-4e2a-b2c0-4127f4670018-kube-api-access-hwpmm\") pod \"cluster-samples-operator-6dc5bdb6b4-rkxx4\" (UID: \"a2b7e351-8153-4e2a-b2c0-4127f4670018\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4" Apr 17 11:18:43.450980 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.450935 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e812ce5b-8977-4351-b49f-0cbf7496a798-image-registry-private-configuration\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.450980 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.450951 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls podName:e812ce5b-8977-4351-b49f-0cbf7496a798 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:43.950930854 +0000 UTC m=+132.893085087 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls") pod "image-registry-7886847858-kdzp6" (UID: "e812ce5b-8977-4351-b49f-0cbf7496a798") : secret "image-registry-tls" not found Apr 17 11:18:43.451196 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.450987 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e812ce5b-8977-4351-b49f-0cbf7496a798-trusted-ca\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.451196 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.451017 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rkxx4\" (UID: \"a2b7e351-8153-4e2a-b2c0-4127f4670018\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4" Apr 17 11:18:43.451196 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.451084 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-certificates\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.451444 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.451415 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e812ce5b-8977-4351-b49f-0cbf7496a798-ca-trust-extracted\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " 
pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.451834 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.451800 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 11:18:43.452006 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.451982 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-bound-sa-token\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.452221 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.452201 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls podName:a2b7e351-8153-4e2a-b2c0-4127f4670018 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:43.952177842 +0000 UTC m=+132.894332062 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rkxx4" (UID: "a2b7e351-8153-4e2a-b2c0-4127f4670018") : secret "samples-operator-tls" not found Apr 17 11:18:43.452293 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.452272 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-certificates\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.454253 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.454223 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e812ce5b-8977-4351-b49f-0cbf7496a798-image-registry-private-configuration\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.455727 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.455705 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e812ce5b-8977-4351-b49f-0cbf7496a798-installation-pull-secrets\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.456549 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.456527 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e812ce5b-8977-4351-b49f-0cbf7496a798-trusted-ca\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " 
pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.462262 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.462238 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwpmm\" (UniqueName: \"kubernetes.io/projected/a2b7e351-8153-4e2a-b2c0-4127f4670018-kube-api-access-hwpmm\") pod \"cluster-samples-operator-6dc5bdb6b4-rkxx4\" (UID: \"a2b7e351-8153-4e2a-b2c0-4127f4670018\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4" Apr 17 11:18:43.462484 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.462467 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf9h7\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-kube-api-access-lf9h7\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.464076 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.464060 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-bound-sa-token\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:18:43.754258 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.754208 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" Apr 17 11:18:43.754447 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.754298 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" Apr 17 11:18:43.754447 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.754371 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 11:18:43.754565 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.754453 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs podName:d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:44.7544369 +0000 UTC m=+133.696591119 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs") pod "router-default-5f8c5cdcb8-ts845" (UID: "d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707") : secret "router-metrics-certs-default" not found Apr 17 11:18:43.754565 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.754467 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle podName:d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:44.754461177 +0000 UTC m=+133.696615396 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle") pod "router-default-5f8c5cdcb8-ts845" (UID: "d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707") : configmap references non-existent config key: service-ca.crt Apr 17 11:18:43.854997 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.854965 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zdr26\" (UID: \"76699d2b-150e-4629-9c16-03548712a64f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26" Apr 17 11:18:43.855163 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.855080 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 11:18:43.855163 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.855140 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert podName:76699d2b-150e-4629-9c16-03548712a64f nodeName:}" failed. No retries permitted until 2026-04-17 11:18:44.855127492 +0000 UTC m=+133.797281710 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zdr26" (UID: "76699d2b-150e-4629-9c16-03548712a64f") : secret "networking-console-plugin-cert" not found
Apr 17 11:18:43.956213 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.956180 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6"
Apr 17 11:18:43.956213 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:43.956227 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rkxx4\" (UID: \"a2b7e351-8153-4e2a-b2c0-4127f4670018\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4"
Apr 17 11:18:43.956427 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.956328 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 11:18:43.956427 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.956334 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:18:43.956427 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.956354 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7886847858-kdzp6: secret "image-registry-tls" not found
Apr 17 11:18:43.956427 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.956387 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls podName:a2b7e351-8153-4e2a-b2c0-4127f4670018 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:44.956373467 +0000 UTC m=+133.898527690 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rkxx4" (UID: "a2b7e351-8153-4e2a-b2c0-4127f4670018") : secret "samples-operator-tls" not found
Apr 17 11:18:43.956427 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:43.956401 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls podName:e812ce5b-8977-4351-b49f-0cbf7496a798 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:44.956394486 +0000 UTC m=+133.898548705 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls") pod "image-registry-7886847858-kdzp6" (UID: "e812ce5b-8977-4351-b49f-0cbf7496a798") : secret "image-registry-tls" not found
Apr 17 11:18:44.763041 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:44.763007 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:44.763405 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:44.763068 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:44.763405 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:44.763154 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 11:18:44.763405 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:44.763183 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle podName:d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:46.763166091 +0000 UTC m=+135.705320314 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle") pod "router-default-5f8c5cdcb8-ts845" (UID: "d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707") : configmap references non-existent config key: service-ca.crt
Apr 17 11:18:44.763405 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:44.763205 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs podName:d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:46.763198897 +0000 UTC m=+135.705353116 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs") pod "router-default-5f8c5cdcb8-ts845" (UID: "d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707") : secret "router-metrics-certs-default" not found
Apr 17 11:18:44.864255 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:44.864218 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zdr26\" (UID: \"76699d2b-150e-4629-9c16-03548712a64f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26"
Apr 17 11:18:44.864421 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:44.864386 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 11:18:44.864502 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:44.864475 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert podName:76699d2b-150e-4629-9c16-03548712a64f nodeName:}" failed. No retries permitted until 2026-04-17 11:18:46.864454062 +0000 UTC m=+135.806608294 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zdr26" (UID: "76699d2b-150e-4629-9c16-03548712a64f") : secret "networking-console-plugin-cert" not found
Apr 17 11:18:44.965248 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:44.965207 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rkxx4\" (UID: \"a2b7e351-8153-4e2a-b2c0-4127f4670018\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4"
Apr 17 11:18:44.965441 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:44.965379 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 11:18:44.965441 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:44.965395 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6"
Apr 17 11:18:44.965542 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:44.965451 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls podName:a2b7e351-8153-4e2a-b2c0-4127f4670018 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:46.96543068 +0000 UTC m=+135.907584899 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rkxx4" (UID: "a2b7e351-8153-4e2a-b2c0-4127f4670018") : secret "samples-operator-tls" not found
Apr 17 11:18:44.965542 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:44.965482 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:18:44.965542 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:44.965493 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7886847858-kdzp6: secret "image-registry-tls" not found
Apr 17 11:18:44.965542 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:44.965532 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls podName:e812ce5b-8977-4351-b49f-0cbf7496a798 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:46.965521138 +0000 UTC m=+135.907675370 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls") pod "image-registry-7886847858-kdzp6" (UID: "e812ce5b-8977-4351-b49f-0cbf7496a798") : secret "image-registry-tls" not found
Apr 17 11:18:46.780524 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:46.780473 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:46.780946 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:46.780602 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:46.780946 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:46.780653 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle podName:d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:50.780632426 +0000 UTC m=+139.722786665 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle") pod "router-default-5f8c5cdcb8-ts845" (UID: "d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707") : configmap references non-existent config key: service-ca.crt
Apr 17 11:18:46.780946 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:46.780710 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 11:18:46.780946 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:46.780749 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs podName:d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:50.780737437 +0000 UTC m=+139.722891657 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs") pod "router-default-5f8c5cdcb8-ts845" (UID: "d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707") : secret "router-metrics-certs-default" not found
Apr 17 11:18:46.881205 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:46.881153 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zdr26\" (UID: \"76699d2b-150e-4629-9c16-03548712a64f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26"
Apr 17 11:18:46.881334 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:46.881318 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 11:18:46.881395 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:46.881385 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert podName:76699d2b-150e-4629-9c16-03548712a64f nodeName:}" failed. No retries permitted until 2026-04-17 11:18:50.88136767 +0000 UTC m=+139.823521891 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zdr26" (UID: "76699d2b-150e-4629-9c16-03548712a64f") : secret "networking-console-plugin-cert" not found
Apr 17 11:18:46.982252 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:46.982204 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6"
Apr 17 11:18:46.982252 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:46.982262 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rkxx4\" (UID: \"a2b7e351-8153-4e2a-b2c0-4127f4670018\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4"
Apr 17 11:18:46.982412 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:46.982357 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:18:46.982412 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:46.982376 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 11:18:46.982412 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:46.982380 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7886847858-kdzp6: secret "image-registry-tls" not found
Apr 17 11:18:46.982499 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:46.982436 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls podName:a2b7e351-8153-4e2a-b2c0-4127f4670018 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:50.98241861 +0000 UTC m=+139.924572831 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rkxx4" (UID: "a2b7e351-8153-4e2a-b2c0-4127f4670018") : secret "samples-operator-tls" not found
Apr 17 11:18:46.982499 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:46.982454 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls podName:e812ce5b-8977-4351-b49f-0cbf7496a798 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:50.982446156 +0000 UTC m=+139.924600374 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls") pod "image-registry-7886847858-kdzp6" (UID: "e812ce5b-8977-4351-b49f-0cbf7496a798") : secret "image-registry-tls" not found
Apr 17 11:18:48.294433 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:48.294400 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-m5hsm"]
Apr 17 11:18:48.298623 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:48.298606 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m5hsm"
Apr 17 11:18:48.301618 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:48.301596 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 17 11:18:48.302890 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:48.302873 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 17 11:18:48.302953 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:48.302890 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-r2jj8\""
Apr 17 11:18:48.308069 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:48.308047 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-m5hsm"]
Apr 17 11:18:48.394899 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:48.394861 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqx5l\" (UniqueName: \"kubernetes.io/projected/4a477a56-2449-459a-8d09-6f1648b29153-kube-api-access-wqx5l\") pod \"migrator-74bb7799d9-m5hsm\" (UID: \"4a477a56-2449-459a-8d09-6f1648b29153\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m5hsm"
Apr 17 11:18:48.496097 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:48.496055 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqx5l\" (UniqueName: \"kubernetes.io/projected/4a477a56-2449-459a-8d09-6f1648b29153-kube-api-access-wqx5l\") pod \"migrator-74bb7799d9-m5hsm\" (UID: \"4a477a56-2449-459a-8d09-6f1648b29153\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m5hsm"
Apr 17 11:18:48.504963 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:48.504935 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqx5l\" (UniqueName: \"kubernetes.io/projected/4a477a56-2449-459a-8d09-6f1648b29153-kube-api-access-wqx5l\") pod \"migrator-74bb7799d9-m5hsm\" (UID: \"4a477a56-2449-459a-8d09-6f1648b29153\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m5hsm"
Apr 17 11:18:48.607485 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:48.607373 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m5hsm"
Apr 17 11:18:48.721705 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:48.721676 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-m5hsm"]
Apr 17 11:18:48.724932 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:18:48.724900 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a477a56_2449_459a_8d09_6f1648b29153.slice/crio-f43b54ae36ae5a4cd7ac706c1c85507f72e059f21d91638a09b2fbc9a9670423 WatchSource:0}: Error finding container f43b54ae36ae5a4cd7ac706c1c85507f72e059f21d91638a09b2fbc9a9670423: Status 404 returned error can't find the container with id f43b54ae36ae5a4cd7ac706c1c85507f72e059f21d91638a09b2fbc9a9670423
Apr 17 11:18:49.051205 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:49.051172 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m5hsm" event={"ID":"4a477a56-2449-459a-8d09-6f1648b29153","Type":"ContainerStarted","Data":"f43b54ae36ae5a4cd7ac706c1c85507f72e059f21d91638a09b2fbc9a9670423"}
Apr 17 11:18:50.054523 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:50.054445 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m5hsm" event={"ID":"4a477a56-2449-459a-8d09-6f1648b29153","Type":"ContainerStarted","Data":"51bb98d4e08fc16978c1044e09ff22e263c7065466c63e422cd9972ca6431873"}
Apr 17 11:18:50.054523 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:50.054476 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m5hsm" event={"ID":"4a477a56-2449-459a-8d09-6f1648b29153","Type":"ContainerStarted","Data":"8d01a4ab96648ee9870fcee3df90e7cdd708f5e63551193014b270839eaf5c07"}
Apr 17 11:18:50.070644 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:50.070601 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m5hsm" podStartSLOduration=1.120998159 podStartE2EDuration="2.070589873s" podCreationTimestamp="2026-04-17 11:18:48 +0000 UTC" firstStartedPulling="2026-04-17 11:18:48.726540452 +0000 UTC m=+137.668694672" lastFinishedPulling="2026-04-17 11:18:49.676132163 +0000 UTC m=+138.618286386" observedRunningTime="2026-04-17 11:18:50.068737107 +0000 UTC m=+139.010891377" watchObservedRunningTime="2026-04-17 11:18:50.070589873 +0000 UTC m=+139.012744105"
Apr 17 11:18:50.816767 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:50.816728 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:50.816990 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:50.816853 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:50.816990 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:50.816917 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle podName:d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:58.816898929 +0000 UTC m=+147.759053152 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle") pod "router-default-5f8c5cdcb8-ts845" (UID: "d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707") : configmap references non-existent config key: service-ca.crt
Apr 17 11:18:50.816990 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:50.816981 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 11:18:50.817124 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:50.817032 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs podName:d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:58.817020913 +0000 UTC m=+147.759175132 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs") pod "router-default-5f8c5cdcb8-ts845" (UID: "d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707") : secret "router-metrics-certs-default" not found
Apr 17 11:18:50.917810 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:50.917771 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zdr26\" (UID: \"76699d2b-150e-4629-9c16-03548712a64f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26"
Apr 17 11:18:50.918028 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:50.917946 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 11:18:50.918094 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:50.918029 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert podName:76699d2b-150e-4629-9c16-03548712a64f nodeName:}" failed. No retries permitted until 2026-04-17 11:18:58.918007767 +0000 UTC m=+147.860162000 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zdr26" (UID: "76699d2b-150e-4629-9c16-03548712a64f") : secret "networking-console-plugin-cert" not found
Apr 17 11:18:51.018216 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:51.018174 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6"
Apr 17 11:18:51.018402 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:51.018225 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rkxx4\" (UID: \"a2b7e351-8153-4e2a-b2c0-4127f4670018\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4"
Apr 17 11:18:51.018402 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:51.018328 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:18:51.018402 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:51.018344 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7886847858-kdzp6: secret "image-registry-tls" not found
Apr 17 11:18:51.018402 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:51.018400 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls podName:e812ce5b-8977-4351-b49f-0cbf7496a798 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:59.018382712 +0000 UTC m=+147.960536939 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls") pod "image-registry-7886847858-kdzp6" (UID: "e812ce5b-8977-4351-b49f-0cbf7496a798") : secret "image-registry-tls" not found
Apr 17 11:18:51.018580 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:51.018403 2581 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 11:18:51.018580 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:51.018468 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls podName:a2b7e351-8153-4e2a-b2c0-4127f4670018 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:59.018450558 +0000 UTC m=+147.960604780 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rkxx4" (UID: "a2b7e351-8153-4e2a-b2c0-4127f4670018") : secret "samples-operator-tls" not found
Apr 17 11:18:51.370520 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:51.370494 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8mkn6_5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96/dns-node-resolver/0.log"
Apr 17 11:18:52.572019 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:52.571990 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sdhv4_3b8f2274-31f6-413e-944a-132f5e8db8f6/node-ca/0.log"
Apr 17 11:18:53.572389 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:53.572360 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-m5hsm_4a477a56-2449-459a-8d09-6f1648b29153/migrator/0.log"
Apr 17 11:18:53.773693 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:53.773665 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-m5hsm_4a477a56-2449-459a-8d09-6f1648b29153/graceful-termination/0.log"
Apr 17 11:18:58.883995 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:58.883940 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:58.884480 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:58.884047 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:58.884480 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:58.884135 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle podName:d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707 nodeName:}" failed. No retries permitted until 2026-04-17 11:19:14.884117093 +0000 UTC m=+163.826271314 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle") pod "router-default-5f8c5cdcb8-ts845" (UID: "d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707") : configmap references non-existent config key: service-ca.crt
Apr 17 11:18:58.886435 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:58.886408 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-metrics-certs\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845"
Apr 17 11:18:58.984989 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:58.984925 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zdr26\" (UID: \"76699d2b-150e-4629-9c16-03548712a64f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26"
Apr 17 11:18:58.985206 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:58.985118 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 11:18:58.985206 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:18:58.985201 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert podName:76699d2b-150e-4629-9c16-03548712a64f nodeName:}" failed. No retries permitted until 2026-04-17 11:19:14.985180589 +0000 UTC m=+163.927334808 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-zdr26" (UID: "76699d2b-150e-4629-9c16-03548712a64f") : secret "networking-console-plugin-cert" not found
Apr 17 11:18:59.086336 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:59.086304 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6"
Apr 17 11:18:59.086541 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:59.086345 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rkxx4\" (UID: \"a2b7e351-8153-4e2a-b2c0-4127f4670018\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4"
Apr 17 11:18:59.088814 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:59.088784 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls\") pod \"image-registry-7886847858-kdzp6\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") " pod="openshift-image-registry/image-registry-7886847858-kdzp6"
Apr 17 11:18:59.088935 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:59.088886 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2b7e351-8153-4e2a-b2c0-4127f4670018-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rkxx4\" (UID: \"a2b7e351-8153-4e2a-b2c0-4127f4670018\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4"
Apr 17 11:18:59.115934 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:59.115899 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4"
Apr 17 11:18:59.166753 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:59.166702 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7886847858-kdzp6"
Apr 17 11:18:59.238116 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:59.238079 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4"]
Apr 17 11:18:59.291258 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:18:59.291233 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7886847858-kdzp6"]
Apr 17 11:18:59.295238 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:18:59.295208 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode812ce5b_8977_4351_b49f_0cbf7496a798.slice/crio-8170b217f19348df6f13700bd7ce319615326ad14878a638af4efa933a8a1024 WatchSource:0}: Error finding container 8170b217f19348df6f13700bd7ce319615326ad14878a638af4efa933a8a1024: Status 404 returned error can't find the container with id 8170b217f19348df6f13700bd7ce319615326ad14878a638af4efa933a8a1024
Apr 17 11:19:00.076386 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:00.076344 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4" event={"ID":"a2b7e351-8153-4e2a-b2c0-4127f4670018","Type":"ContainerStarted","Data":"86dd13421903b61cec50ff19922b7b7c1379b0d811fee3807c7bb2cd4b3c93bc"}
Apr 17 11:19:00.077487 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:00.077464 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7886847858-kdzp6" event={"ID":"e812ce5b-8977-4351-b49f-0cbf7496a798","Type":"ContainerStarted","Data":"06d76ea6314ec8e526e7f55535c0526a7ea21dd97ddfa0ba1706b8e6bcc7a0b7"}
Apr 17 11:19:00.077619 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:00.077498 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7886847858-kdzp6" event={"ID":"e812ce5b-8977-4351-b49f-0cbf7496a798","Type":"ContainerStarted","Data":"8170b217f19348df6f13700bd7ce319615326ad14878a638af4efa933a8a1024"}
Apr 17 11:19:00.077892 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:00.077859 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7886847858-kdzp6"
Apr 17 11:19:00.097131 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:00.097089 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7886847858-kdzp6" podStartSLOduration=17.09707533 podStartE2EDuration="17.09707533s" podCreationTimestamp="2026-04-17 11:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:19:00.096559271 +0000 UTC m=+149.038713513" watchObservedRunningTime="2026-04-17 11:19:00.09707533 +0000 UTC m=+149.039229570"
Apr 17 11:19:01.082891 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:01.082846 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4" event={"ID":"a2b7e351-8153-4e2a-b2c0-4127f4670018","Type":"ContainerStarted","Data":"064f7cedbdd96e8e05e61725bb2ce28e83caf30cb9d594e96e74b701d7faee04"}
Apr 17 11:19:01.082891 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:01.082886 2581 kubelet.go:2569] "SyncLoop (PLEG): event for
pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4" event={"ID":"a2b7e351-8153-4e2a-b2c0-4127f4670018","Type":"ContainerStarted","Data":"3c6892928df3cf9b870ba075a6659990149a651a49d5806a6dde8d9a0b2de9a8"} Apr 17 11:19:01.100027 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:01.099981 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rkxx4" podStartSLOduration=16.49984308 podStartE2EDuration="18.099966915s" podCreationTimestamp="2026-04-17 11:18:43 +0000 UTC" firstStartedPulling="2026-04-17 11:18:59.295946924 +0000 UTC m=+148.238101143" lastFinishedPulling="2026-04-17 11:19:00.896070757 +0000 UTC m=+149.838224978" observedRunningTime="2026-04-17 11:19:01.099254922 +0000 UTC m=+150.041409163" watchObservedRunningTime="2026-04-17 11:19:01.099966915 +0000 UTC m=+150.042121192" Apr 17 11:19:07.466568 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:19:07.466513 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-7wbxp" podUID="8fe4a1b8-871d-4f96-95f0-946d542180da" Apr 17 11:19:07.496985 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:19:07.496953 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-rpkz7" podUID="2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3" Apr 17 11:19:07.684961 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:19:07.684909 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-9k5nh" podUID="bee0bc88-7732-4010-9886-3df7384bf1c8" Apr 17 11:19:08.097918 
ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:08.097889 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7wbxp" Apr 17 11:19:10.599432 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.599397 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7886847858-kdzp6"] Apr 17 11:19:10.622083 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.622046 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wpkcf"] Apr 17 11:19:10.626616 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.626595 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.629142 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.629120 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5zvgq\"" Apr 17 11:19:10.629284 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.629241 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 11:19:10.629284 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.629241 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 11:19:10.629463 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.629448 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 11:19:10.629553 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.629539 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 11:19:10.643065 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.643042 2581 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wpkcf"] Apr 17 11:19:10.678625 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.678593 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9mcj\" (UniqueName: \"kubernetes.io/projected/5870bfff-e1f4-4430-8513-d97e5bb1dcaa-kube-api-access-x9mcj\") pod \"insights-runtime-extractor-wpkcf\" (UID: \"5870bfff-e1f4-4430-8513-d97e5bb1dcaa\") " pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.678625 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.678626 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5870bfff-e1f4-4430-8513-d97e5bb1dcaa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wpkcf\" (UID: \"5870bfff-e1f4-4430-8513-d97e5bb1dcaa\") " pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.678846 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.678654 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5870bfff-e1f4-4430-8513-d97e5bb1dcaa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wpkcf\" (UID: \"5870bfff-e1f4-4430-8513-d97e5bb1dcaa\") " pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.678846 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.678760 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5870bfff-e1f4-4430-8513-d97e5bb1dcaa-crio-socket\") pod \"insights-runtime-extractor-wpkcf\" (UID: \"5870bfff-e1f4-4430-8513-d97e5bb1dcaa\") " pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.678846 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.678785 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5870bfff-e1f4-4430-8513-d97e5bb1dcaa-data-volume\") pod \"insights-runtime-extractor-wpkcf\" (UID: \"5870bfff-e1f4-4430-8513-d97e5bb1dcaa\") " pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.779991 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.779960 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5870bfff-e1f4-4430-8513-d97e5bb1dcaa-data-volume\") pod \"insights-runtime-extractor-wpkcf\" (UID: \"5870bfff-e1f4-4430-8513-d97e5bb1dcaa\") " pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.780202 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.780045 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9mcj\" (UniqueName: \"kubernetes.io/projected/5870bfff-e1f4-4430-8513-d97e5bb1dcaa-kube-api-access-x9mcj\") pod \"insights-runtime-extractor-wpkcf\" (UID: \"5870bfff-e1f4-4430-8513-d97e5bb1dcaa\") " pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.780202 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.780079 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5870bfff-e1f4-4430-8513-d97e5bb1dcaa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wpkcf\" (UID: \"5870bfff-e1f4-4430-8513-d97e5bb1dcaa\") " pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.780202 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.780112 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5870bfff-e1f4-4430-8513-d97e5bb1dcaa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wpkcf\" (UID: 
\"5870bfff-e1f4-4430-8513-d97e5bb1dcaa\") " pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.780202 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.780178 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5870bfff-e1f4-4430-8513-d97e5bb1dcaa-crio-socket\") pod \"insights-runtime-extractor-wpkcf\" (UID: \"5870bfff-e1f4-4430-8513-d97e5bb1dcaa\") " pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.780389 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.780273 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5870bfff-e1f4-4430-8513-d97e5bb1dcaa-crio-socket\") pod \"insights-runtime-extractor-wpkcf\" (UID: \"5870bfff-e1f4-4430-8513-d97e5bb1dcaa\") " pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.780389 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.780350 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5870bfff-e1f4-4430-8513-d97e5bb1dcaa-data-volume\") pod \"insights-runtime-extractor-wpkcf\" (UID: \"5870bfff-e1f4-4430-8513-d97e5bb1dcaa\") " pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.780700 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.780681 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5870bfff-e1f4-4430-8513-d97e5bb1dcaa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wpkcf\" (UID: \"5870bfff-e1f4-4430-8513-d97e5bb1dcaa\") " pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.782443 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.782418 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/5870bfff-e1f4-4430-8513-d97e5bb1dcaa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wpkcf\" (UID: \"5870bfff-e1f4-4430-8513-d97e5bb1dcaa\") " pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.789266 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.789242 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9mcj\" (UniqueName: \"kubernetes.io/projected/5870bfff-e1f4-4430-8513-d97e5bb1dcaa-kube-api-access-x9mcj\") pod \"insights-runtime-extractor-wpkcf\" (UID: \"5870bfff-e1f4-4430-8513-d97e5bb1dcaa\") " pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:10.935502 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:10.935470 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wpkcf" Apr 17 11:19:11.055412 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:11.055261 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wpkcf"] Apr 17 11:19:11.057850 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:19:11.057809 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5870bfff_e1f4_4430_8513_d97e5bb1dcaa.slice/crio-e945f69f1a4aaf7db6ce3bca3cff548134a7791d9a99e8887adf72ef0d0a61f0 WatchSource:0}: Error finding container e945f69f1a4aaf7db6ce3bca3cff548134a7791d9a99e8887adf72ef0d0a61f0: Status 404 returned error can't find the container with id e945f69f1a4aaf7db6ce3bca3cff548134a7791d9a99e8887adf72ef0d0a61f0 Apr 17 11:19:11.105510 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:11.105477 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wpkcf" event={"ID":"5870bfff-e1f4-4430-8513-d97e5bb1dcaa","Type":"ContainerStarted","Data":"e945f69f1a4aaf7db6ce3bca3cff548134a7791d9a99e8887adf72ef0d0a61f0"} Apr 17 
11:19:12.109867 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:12.109812 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wpkcf" event={"ID":"5870bfff-e1f4-4430-8513-d97e5bb1dcaa","Type":"ContainerStarted","Data":"0a51da5229c73e5e8ce33e8084b78ad9506a68188f8d4d6d1bc3f68358e86b74"} Apr 17 11:19:12.109867 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:12.109866 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wpkcf" event={"ID":"5870bfff-e1f4-4430-8513-d97e5bb1dcaa","Type":"ContainerStarted","Data":"3b266666e4a6ba9d81b70653977b612b24ab56f0be802748c322e5dac6a81674"} Apr 17 11:19:12.393867 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:12.393769 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7" Apr 17 11:19:12.393867 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:12.393844 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert\") pod \"ingress-canary-7wbxp\" (UID: \"8fe4a1b8-871d-4f96-95f0-946d542180da\") " pod="openshift-ingress-canary/ingress-canary-7wbxp" Apr 17 11:19:12.396671 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:12.396647 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe4a1b8-871d-4f96-95f0-946d542180da-cert\") pod \"ingress-canary-7wbxp\" (UID: \"8fe4a1b8-871d-4f96-95f0-946d542180da\") " pod="openshift-ingress-canary/ingress-canary-7wbxp" Apr 17 11:19:12.396833 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:12.396795 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3-metrics-tls\") pod \"dns-default-rpkz7\" (UID: \"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3\") " pod="openshift-dns/dns-default-rpkz7" Apr 17 11:19:12.601355 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:12.601322 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5xzqf\"" Apr 17 11:19:12.608793 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:12.608770 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7wbxp" Apr 17 11:19:13.134894 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:13.134870 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7wbxp"] Apr 17 11:19:13.137865 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:19:13.137836 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fe4a1b8_871d_4f96_95f0_946d542180da.slice/crio-c5c0b386cf57c4817fe331db48d921183865421867aaab0e6dffc7b99c648823 WatchSource:0}: Error finding container c5c0b386cf57c4817fe331db48d921183865421867aaab0e6dffc7b99c648823: Status 404 returned error can't find the container with id c5c0b386cf57c4817fe331db48d921183865421867aaab0e6dffc7b99c648823 Apr 17 11:19:14.117259 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:14.117217 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wpkcf" event={"ID":"5870bfff-e1f4-4430-8513-d97e5bb1dcaa","Type":"ContainerStarted","Data":"f717aab902e75a33ef4079358a6207c60cb3a9b396e70fc44f7e9ed02be466cc"} Apr 17 11:19:14.118410 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:14.118383 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7wbxp" 
event={"ID":"8fe4a1b8-871d-4f96-95f0-946d542180da","Type":"ContainerStarted","Data":"c5c0b386cf57c4817fe331db48d921183865421867aaab0e6dffc7b99c648823"} Apr 17 11:19:14.143714 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:14.143663 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wpkcf" podStartSLOduration=2.200888643 podStartE2EDuration="4.143644326s" podCreationTimestamp="2026-04-17 11:19:10 +0000 UTC" firstStartedPulling="2026-04-17 11:19:11.110699884 +0000 UTC m=+160.052854103" lastFinishedPulling="2026-04-17 11:19:13.053455565 +0000 UTC m=+161.995609786" observedRunningTime="2026-04-17 11:19:14.142573889 +0000 UTC m=+163.084728129" watchObservedRunningTime="2026-04-17 11:19:14.143644326 +0000 UTC m=+163.085798586" Apr 17 11:19:14.915514 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:14.915473 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" Apr 17 11:19:14.916108 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:14.916086 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707-service-ca-bundle\") pod \"router-default-5f8c5cdcb8-ts845\" (UID: \"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707\") " pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" Apr 17 11:19:15.016841 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:15.016780 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-zdr26\" (UID: \"76699d2b-150e-4629-9c16-03548712a64f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26" Apr 17 11:19:15.019199 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:15.019167 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/76699d2b-150e-4629-9c16-03548712a64f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zdr26\" (UID: \"76699d2b-150e-4629-9c16-03548712a64f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26" Apr 17 11:19:15.122373 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:15.122338 2581 generic.go:358] "Generic (PLEG): container finished" podID="5c8fd482-126f-4f85-ab60-3c44fd0c243e" containerID="04d5fbf8b2b1bffe3ce691dab16be52f8e565771da2f38c46f7a46300b86908a" exitCode=255 Apr 17 11:19:15.122553 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:15.122415 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx" event={"ID":"5c8fd482-126f-4f85-ab60-3c44fd0c243e","Type":"ContainerDied","Data":"04d5fbf8b2b1bffe3ce691dab16be52f8e565771da2f38c46f7a46300b86908a"} Apr 17 11:19:15.123724 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:15.123698 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7wbxp" event={"ID":"8fe4a1b8-871d-4f96-95f0-946d542180da","Type":"ContainerStarted","Data":"853e309cefac2a6946d0f84a090babf1aa5a9bd93d24a8aea8111bb88259cc01"} Apr 17 11:19:15.128986 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:15.128965 2581 scope.go:117] "RemoveContainer" containerID="04d5fbf8b2b1bffe3ce691dab16be52f8e565771da2f38c46f7a46300b86908a" Apr 17 11:19:15.159741 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:15.159691 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-canary/ingress-canary-7wbxp" podStartSLOduration=129.658909719 podStartE2EDuration="2m11.159673632s" podCreationTimestamp="2026-04-17 11:17:04 +0000 UTC" firstStartedPulling="2026-04-17 11:19:13.139795337 +0000 UTC m=+162.081949556" lastFinishedPulling="2026-04-17 11:19:14.640559237 +0000 UTC m=+163.582713469" observedRunningTime="2026-04-17 11:19:15.15876802 +0000 UTC m=+164.100922261" watchObservedRunningTime="2026-04-17 11:19:15.159673632 +0000 UTC m=+164.101827874" Apr 17 11:19:15.170353 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:15.170293 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" Apr 17 11:19:15.290143 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:15.290114 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26" Apr 17 11:19:15.307372 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:15.307343 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5f8c5cdcb8-ts845"] Apr 17 11:19:15.310659 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:19:15.310633 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd77ac6f7_5bf1_4c3d_88ac_46e8ef9f8707.slice/crio-a343f7948a135a31ae98be4bd4fef0da7e2bbd20ff161678c136b37e7f2c9b2d WatchSource:0}: Error finding container a343f7948a135a31ae98be4bd4fef0da7e2bbd20ff161678c136b37e7f2c9b2d: Status 404 returned error can't find the container with id a343f7948a135a31ae98be4bd4fef0da7e2bbd20ff161678c136b37e7f2c9b2d Apr 17 11:19:15.414413 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:15.413864 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zdr26"] Apr 17 11:19:15.416788 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:19:15.416761 2581 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76699d2b_150e_4629_9c16_03548712a64f.slice/crio-f476639f7bb82f4affc11661a483906c444d836463d6e7177eebabf4bd83b5ef WatchSource:0}: Error finding container f476639f7bb82f4affc11661a483906c444d836463d6e7177eebabf4bd83b5ef: Status 404 returned error can't find the container with id f476639f7bb82f4affc11661a483906c444d836463d6e7177eebabf4bd83b5ef Apr 17 11:19:16.127515 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:16.127482 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6db7d65f79-r8wpx" event={"ID":"5c8fd482-126f-4f85-ab60-3c44fd0c243e","Type":"ContainerStarted","Data":"f379353c7a11e423f59a1c3f3ce37d48a2b1d4b8a76c873a2d81cc10c9f94ab3"} Apr 17 11:19:16.128530 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:16.128500 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26" event={"ID":"76699d2b-150e-4629-9c16-03548712a64f","Type":"ContainerStarted","Data":"f476639f7bb82f4affc11661a483906c444d836463d6e7177eebabf4bd83b5ef"} Apr 17 11:19:16.129700 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:16.129669 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" event={"ID":"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707","Type":"ContainerStarted","Data":"d5399a2bb23fb9eb00eafb9c78cc8d17559d8799f2ecc1784bde4f3adaddb3a1"} Apr 17 11:19:16.129801 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:16.129705 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" event={"ID":"d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707","Type":"ContainerStarted","Data":"a343f7948a135a31ae98be4bd4fef0da7e2bbd20ff161678c136b37e7f2c9b2d"} Apr 17 11:19:16.164625 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:16.164567 2581 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" podStartSLOduration=33.164551534 podStartE2EDuration="33.164551534s" podCreationTimestamp="2026-04-17 11:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:19:16.164213575 +0000 UTC m=+165.106367841" watchObservedRunningTime="2026-04-17 11:19:16.164551534 +0000 UTC m=+165.106705774" Apr 17 11:19:16.170634 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:16.170609 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" Apr 17 11:19:16.173724 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:16.173684 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" Apr 17 11:19:17.134460 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:17.134420 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26" event={"ID":"76699d2b-150e-4629-9c16-03548712a64f","Type":"ContainerStarted","Data":"fc67b10ab6930d0de29063e47930628652c10b85f0136c5d15b48ff095db8568"} Apr 17 11:19:17.134639 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:17.134620 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" Apr 17 11:19:17.135868 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:17.135849 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5f8c5cdcb8-ts845" Apr 17 11:19:17.160331 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:17.160289 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zdr26" podStartSLOduration=33.194473682 
podStartE2EDuration="34.160275461s" podCreationTimestamp="2026-04-17 11:18:43 +0000 UTC" firstStartedPulling="2026-04-17 11:19:15.41858038 +0000 UTC m=+164.360734600" lastFinishedPulling="2026-04-17 11:19:16.38438216 +0000 UTC m=+165.326536379" observedRunningTime="2026-04-17 11:19:17.159013303 +0000 UTC m=+166.101167543" watchObservedRunningTime="2026-04-17 11:19:17.160275461 +0000 UTC m=+166.102429699" Apr 17 11:19:18.657323 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:18.657272 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh" Apr 17 11:19:19.657383 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:19.657341 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rpkz7" Apr 17 11:19:19.660493 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:19.660471 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-v5zg9\"" Apr 17 11:19:19.667980 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:19.667957 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rpkz7" Apr 17 11:19:19.787810 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:19.787616 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rpkz7"] Apr 17 11:19:19.790440 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:19:19.790409 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ae9be6b_7bb8_4ea9_b282_621f3e3bb4a3.slice/crio-2becb65546e6a0d2fd0e659fd61620be0bc94087865943c1940ffb86f74770f7 WatchSource:0}: Error finding container 2becb65546e6a0d2fd0e659fd61620be0bc94087865943c1940ffb86f74770f7: Status 404 returned error can't find the container with id 2becb65546e6a0d2fd0e659fd61620be0bc94087865943c1940ffb86f74770f7 Apr 17 11:19:20.143768 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.143726 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rpkz7" event={"ID":"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3","Type":"ContainerStarted","Data":"2becb65546e6a0d2fd0e659fd61620be0bc94087865943c1940ffb86f74770f7"} Apr 17 11:19:20.409583 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.409463 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-d6d28"] Apr 17 11:19:20.412734 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.412713 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" Apr 17 11:19:20.416195 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.416168 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 11:19:20.417362 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.417336 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 11:19:20.417493 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.417473 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 11:19:20.417566 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.417501 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 11:19:20.417566 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.417558 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 11:19:20.417684 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.417654 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-x5mkb\"" Apr 17 11:19:20.424158 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.424134 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-d6d28"] Apr 17 11:19:20.458084 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.458038 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8s6d\" (UniqueName: \"kubernetes.io/projected/aa810299-6a4d-4ef5-b2b3-abe18f37385c-kube-api-access-z8s6d\") pod \"prometheus-operator-5676c8c784-d6d28\" (UID: 
\"aa810299-6a4d-4ef5-b2b3-abe18f37385c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" Apr 17 11:19:20.458084 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.458086 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa810299-6a4d-4ef5-b2b3-abe18f37385c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-d6d28\" (UID: \"aa810299-6a4d-4ef5-b2b3-abe18f37385c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" Apr 17 11:19:20.458338 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.458164 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aa810299-6a4d-4ef5-b2b3-abe18f37385c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-d6d28\" (UID: \"aa810299-6a4d-4ef5-b2b3-abe18f37385c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" Apr 17 11:19:20.458338 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.458273 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa810299-6a4d-4ef5-b2b3-abe18f37385c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-d6d28\" (UID: \"aa810299-6a4d-4ef5-b2b3-abe18f37385c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" Apr 17 11:19:20.559194 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.559149 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8s6d\" (UniqueName: \"kubernetes.io/projected/aa810299-6a4d-4ef5-b2b3-abe18f37385c-kube-api-access-z8s6d\") pod \"prometheus-operator-5676c8c784-d6d28\" (UID: \"aa810299-6a4d-4ef5-b2b3-abe18f37385c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" Apr 17 
11:19:20.559386 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.559214 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa810299-6a4d-4ef5-b2b3-abe18f37385c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-d6d28\" (UID: \"aa810299-6a4d-4ef5-b2b3-abe18f37385c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" Apr 17 11:19:20.559386 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.559260 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aa810299-6a4d-4ef5-b2b3-abe18f37385c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-d6d28\" (UID: \"aa810299-6a4d-4ef5-b2b3-abe18f37385c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" Apr 17 11:19:20.559386 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.559339 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa810299-6a4d-4ef5-b2b3-abe18f37385c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-d6d28\" (UID: \"aa810299-6a4d-4ef5-b2b3-abe18f37385c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" Apr 17 11:19:20.560242 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.560215 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa810299-6a4d-4ef5-b2b3-abe18f37385c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-d6d28\" (UID: \"aa810299-6a4d-4ef5-b2b3-abe18f37385c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" Apr 17 11:19:20.562051 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.562020 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/aa810299-6a4d-4ef5-b2b3-abe18f37385c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-d6d28\" (UID: \"aa810299-6a4d-4ef5-b2b3-abe18f37385c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" Apr 17 11:19:20.562145 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.562044 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aa810299-6a4d-4ef5-b2b3-abe18f37385c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-d6d28\" (UID: \"aa810299-6a4d-4ef5-b2b3-abe18f37385c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" Apr 17 11:19:20.572208 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.572185 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8s6d\" (UniqueName: \"kubernetes.io/projected/aa810299-6a4d-4ef5-b2b3-abe18f37385c-kube-api-access-z8s6d\") pod \"prometheus-operator-5676c8c784-d6d28\" (UID: \"aa810299-6a4d-4ef5-b2b3-abe18f37385c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" Apr 17 11:19:20.604552 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.604514 2581 patch_prober.go:28] interesting pod/image-registry-7886847858-kdzp6 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 11:19:20.604713 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:20.604586 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7886847858-kdzp6" podUID="e812ce5b-8977-4351-b49f-0cbf7496a798" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 11:19:20.724046 ip-10-0-136-177 
kubenswrapper[2581]: I0417 11:19:20.723956 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" Apr 17 11:19:21.089770 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:21.089740 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-d6d28"] Apr 17 11:19:21.092752 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:19:21.092714 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa810299_6a4d_4ef5_b2b3_abe18f37385c.slice/crio-f46acc73f45f7031430b052667413d6a72909d6f54e5d59d91381c760bb23d3f WatchSource:0}: Error finding container f46acc73f45f7031430b052667413d6a72909d6f54e5d59d91381c760bb23d3f: Status 404 returned error can't find the container with id f46acc73f45f7031430b052667413d6a72909d6f54e5d59d91381c760bb23d3f Apr 17 11:19:21.147425 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:21.147393 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rpkz7" event={"ID":"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3","Type":"ContainerStarted","Data":"62540e3fa9c8b54706e597119cf00bec20d9b6099200f4de37e8fdb16c94606c"} Apr 17 11:19:21.148619 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:21.148596 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" event={"ID":"aa810299-6a4d-4ef5-b2b3-abe18f37385c","Type":"ContainerStarted","Data":"f46acc73f45f7031430b052667413d6a72909d6f54e5d59d91381c760bb23d3f"} Apr 17 11:19:22.152887 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:22.152839 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rpkz7" event={"ID":"2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3","Type":"ContainerStarted","Data":"1c3e0976360dbefb4196af852cdee5d5fd6ad984539f88e67d3c3a0dc4a3ed51"} Apr 17 11:19:22.153333 ip-10-0-136-177 
kubenswrapper[2581]: I0417 11:19:22.152986 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rpkz7" Apr 17 11:19:22.175144 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:22.175088 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rpkz7" podStartSLOduration=136.959509262 podStartE2EDuration="2m18.175071289s" podCreationTimestamp="2026-04-17 11:17:04 +0000 UTC" firstStartedPulling="2026-04-17 11:19:19.792321168 +0000 UTC m=+168.734475387" lastFinishedPulling="2026-04-17 11:19:21.007883185 +0000 UTC m=+169.950037414" observedRunningTime="2026-04-17 11:19:22.174379087 +0000 UTC m=+171.116533328" watchObservedRunningTime="2026-04-17 11:19:22.175071289 +0000 UTC m=+171.117225523" Apr 17 11:19:23.156320 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:23.156280 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" event={"ID":"aa810299-6a4d-4ef5-b2b3-abe18f37385c","Type":"ContainerStarted","Data":"5fc902dd442840c0d166b6dd1fda115703869c85f21bf5bc878c404b26cf39c7"} Apr 17 11:19:23.156320 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:23.156322 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" event={"ID":"aa810299-6a4d-4ef5-b2b3-abe18f37385c","Type":"ContainerStarted","Data":"79f3c7630bb230498ae174748ce07d3428353b3b2ce4f8b0c20d39ca6fabf0b8"} Apr 17 11:19:23.182185 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:23.182122 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-d6d28" podStartSLOduration=2.052499957 podStartE2EDuration="3.182106203s" podCreationTimestamp="2026-04-17 11:19:20 +0000 UTC" firstStartedPulling="2026-04-17 11:19:21.094595033 +0000 UTC m=+170.036749251" lastFinishedPulling="2026-04-17 11:19:22.224201273 +0000 UTC 
m=+171.166355497" observedRunningTime="2026-04-17 11:19:23.181684668 +0000 UTC m=+172.123838921" watchObservedRunningTime="2026-04-17 11:19:23.182106203 +0000 UTC m=+172.124260440" Apr 17 11:19:24.889350 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:24.889318 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lwtkb"] Apr 17 11:19:24.892528 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:24.892501 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:24.896780 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:24.896759 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2grgd\"" Apr 17 11:19:24.897620 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:24.897594 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 11:19:24.897705 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:24.897668 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 11:19:24.897757 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:24.897729 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 11:19:24.993726 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:24.993691 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33907f69-05ed-46d3-a4c7-b24d3165e1e7-metrics-client-ca\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:24.993726 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:24.993735 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/33907f69-05ed-46d3-a4c7-b24d3165e1e7-root\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:24.993962 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:24.993753 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-tls\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:24.993962 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:24.993877 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-wtmp\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:24.993962 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:24.993905 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:24.993962 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:24.993951 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33907f69-05ed-46d3-a4c7-b24d3165e1e7-sys\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" 
Apr 17 11:19:24.994088 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:24.993972 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-textfile\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:24.994088 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:24.994045 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhmkm\" (UniqueName: \"kubernetes.io/projected/33907f69-05ed-46d3-a4c7-b24d3165e1e7-kube-api-access-vhmkm\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:24.994088 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:24.994070 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-accelerators-collector-config\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.095398 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.095360 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhmkm\" (UniqueName: \"kubernetes.io/projected/33907f69-05ed-46d3-a4c7-b24d3165e1e7-kube-api-access-vhmkm\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.095398 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.095398 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-accelerators-collector-config\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.095636 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.095523 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33907f69-05ed-46d3-a4c7-b24d3165e1e7-metrics-client-ca\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.095636 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.095578 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/33907f69-05ed-46d3-a4c7-b24d3165e1e7-root\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.095636 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.095603 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-tls\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.095782 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.095633 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/33907f69-05ed-46d3-a4c7-b24d3165e1e7-root\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.095782 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.095654 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-wtmp\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.095782 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:19:25.095728 2581 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 11:19:25.095966 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:19:25.095798 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-tls podName:33907f69-05ed-46d3-a4c7-b24d3165e1e7 nodeName:}" failed. No retries permitted until 2026-04-17 11:19:25.595777356 +0000 UTC m=+174.537931575 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-tls") pod "node-exporter-lwtkb" (UID: "33907f69-05ed-46d3-a4c7-b24d3165e1e7") : secret "node-exporter-tls" not found Apr 17 11:19:25.095966 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.095814 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.095966 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.095881 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33907f69-05ed-46d3-a4c7-b24d3165e1e7-sys\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.095966 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.095897 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-textfile\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.095966 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.095841 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-wtmp\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.096208 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.095988 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33907f69-05ed-46d3-a4c7-b24d3165e1e7-sys\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.096208 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.096035 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-accelerators-collector-config\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.096208 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.096078 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33907f69-05ed-46d3-a4c7-b24d3165e1e7-metrics-client-ca\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.096318 ip-10-0-136-177 
kubenswrapper[2581]: I0417 11:19:25.096228 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-textfile\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.098080 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.098058 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.104535 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.104511 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhmkm\" (UniqueName: \"kubernetes.io/projected/33907f69-05ed-46d3-a4c7-b24d3165e1e7-kube-api-access-vhmkm\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.599287 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.599251 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-tls\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.601545 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.601520 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/33907f69-05ed-46d3-a4c7-b24d3165e1e7-node-exporter-tls\") pod \"node-exporter-lwtkb\" (UID: \"33907f69-05ed-46d3-a4c7-b24d3165e1e7\") " 
pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.802446 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:25.802412 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lwtkb" Apr 17 11:19:25.811999 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:19:25.811956 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33907f69_05ed_46d3_a4c7_b24d3165e1e7.slice/crio-063cc25b4e69020e1d614a8ffd14c1687c5545470fdb09cb1b1ffb35bf4ce4f3 WatchSource:0}: Error finding container 063cc25b4e69020e1d614a8ffd14c1687c5545470fdb09cb1b1ffb35bf4ce4f3: Status 404 returned error can't find the container with id 063cc25b4e69020e1d614a8ffd14c1687c5545470fdb09cb1b1ffb35bf4ce4f3 Apr 17 11:19:26.166158 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:26.166120 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lwtkb" event={"ID":"33907f69-05ed-46d3-a4c7-b24d3165e1e7","Type":"ContainerStarted","Data":"063cc25b4e69020e1d614a8ffd14c1687c5545470fdb09cb1b1ffb35bf4ce4f3"} Apr 17 11:19:27.169974 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:27.169935 2581 generic.go:358] "Generic (PLEG): container finished" podID="33907f69-05ed-46d3-a4c7-b24d3165e1e7" containerID="0cf4ea6bc79b66b5563fb0ab67bb7d6b869429f782669665aaa8a1942eeec109" exitCode=0 Apr 17 11:19:27.169974 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:27.169978 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lwtkb" event={"ID":"33907f69-05ed-46d3-a4c7-b24d3165e1e7","Type":"ContainerDied","Data":"0cf4ea6bc79b66b5563fb0ab67bb7d6b869429f782669665aaa8a1942eeec109"} Apr 17 11:19:27.900942 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:27.900902 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-dccb875d5-dgb9v"] Apr 17 11:19:27.904390 ip-10-0-136-177 
kubenswrapper[2581]: I0417 11:19:27.904373 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:27.907879 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:27.907834 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 11:19:27.908044 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:27.907953 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-bpqqeiqf7mtks\"" Apr 17 11:19:27.908044 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:27.908036 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 11:19:27.908165 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:27.908104 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 11:19:27.908325 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:27.908309 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 11:19:27.908664 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:27.908645 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-hwlr9\"" Apr 17 11:19:27.908746 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:27.908712 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 11:19:27.920393 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:27.920373 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-dccb875d5-dgb9v"] Apr 17 11:19:28.017746 ip-10-0-136-177 kubenswrapper[2581]: I0417 
11:19:28.017698 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxlk8\" (UniqueName: \"kubernetes.io/projected/4121857a-9f7b-48f9-83c8-b78509c1d47f-kube-api-access-zxlk8\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.017984 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.017794 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-thanos-querier-tls\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.017984 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.017844 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.017984 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.017896 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.017984 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.017923 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.017984 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.017965 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-grpc-tls\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.018158 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.018007 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.018158 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.018028 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4121857a-9f7b-48f9-83c8-b78509c1d47f-metrics-client-ca\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.118784 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.118737 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.118973 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.118790 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.118973 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.118845 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-grpc-tls\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.118973 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.118903 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.119148 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.119073 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4121857a-9f7b-48f9-83c8-b78509c1d47f-metrics-client-ca\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") 
" pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.119148 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.119117 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxlk8\" (UniqueName: \"kubernetes.io/projected/4121857a-9f7b-48f9-83c8-b78509c1d47f-kube-api-access-zxlk8\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.119255 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.119169 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-thanos-querier-tls\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.119255 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.119189 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.119867 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.119836 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4121857a-9f7b-48f9-83c8-b78509c1d47f-metrics-client-ca\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.121566 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.121501 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.121757 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.121736 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-thanos-querier-tls\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.121945 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.121924 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.121986 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.121949 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.121986 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.121970 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-grpc-tls\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: 
\"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.122053 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.121995 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4121857a-9f7b-48f9-83c8-b78509c1d47f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.132196 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.132173 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxlk8\" (UniqueName: \"kubernetes.io/projected/4121857a-9f7b-48f9-83c8-b78509c1d47f-kube-api-access-zxlk8\") pod \"thanos-querier-dccb875d5-dgb9v\" (UID: \"4121857a-9f7b-48f9-83c8-b78509c1d47f\") " pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.175583 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.175494 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lwtkb" event={"ID":"33907f69-05ed-46d3-a4c7-b24d3165e1e7","Type":"ContainerStarted","Data":"9ef43336df01e1b76fa0174676d0ea6985d3cd6efb3d748888567ee9be760814"} Apr 17 11:19:28.175583 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.175538 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lwtkb" event={"ID":"33907f69-05ed-46d3-a4c7-b24d3165e1e7","Type":"ContainerStarted","Data":"6bc916bfe35c7f2a374de9f53178f43472f1526d604fb8e2cecb73c3475611ef"} Apr 17 11:19:28.193704 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.193641 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lwtkb" podStartSLOduration=3.446493269 podStartE2EDuration="4.193624413s" podCreationTimestamp="2026-04-17 11:19:24 +0000 UTC" 
firstStartedPulling="2026-04-17 11:19:25.813946391 +0000 UTC m=+174.756100625" lastFinishedPulling="2026-04-17 11:19:26.561077536 +0000 UTC m=+175.503231769" observedRunningTime="2026-04-17 11:19:28.193257804 +0000 UTC m=+177.135412048" watchObservedRunningTime="2026-04-17 11:19:28.193624413 +0000 UTC m=+177.135778648" Apr 17 11:19:28.213387 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.213359 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:28.338178 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:28.338145 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-dccb875d5-dgb9v"] Apr 17 11:19:28.342241 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:19:28.342213 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4121857a_9f7b_48f9_83c8_b78509c1d47f.slice/crio-a4fe5447c7145aac66ebe06417dc4eed4507b83657b59484639efa271d925483 WatchSource:0}: Error finding container a4fe5447c7145aac66ebe06417dc4eed4507b83657b59484639efa271d925483: Status 404 returned error can't find the container with id a4fe5447c7145aac66ebe06417dc4eed4507b83657b59484639efa271d925483 Apr 17 11:19:29.179847 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.179794 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" event={"ID":"4121857a-9f7b-48f9-83c8-b78509c1d47f","Type":"ContainerStarted","Data":"a4fe5447c7145aac66ebe06417dc4eed4507b83657b59484639efa271d925483"} Apr 17 11:19:29.208261 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.208223 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-bff94c98c-8ffzl"] Apr 17 11:19:29.211409 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.211376 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.215643 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.215616 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-z8wwp\"" Apr 17 11:19:29.215772 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.215644 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 11:19:29.215772 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.215657 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 11:19:29.215772 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.215664 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 11:19:29.215975 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.215796 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-7inqeji556n8u\"" Apr 17 11:19:29.215975 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.215930 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 11:19:29.223321 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.223298 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-bff94c98c-8ffzl"] Apr 17 11:19:29.328288 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.328251 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/73a3d378-3878-418a-9279-539c643fab5a-secret-metrics-server-tls\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " 
pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.328288 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.328296 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a3d378-3878-418a-9279-539c643fab5a-client-ca-bundle\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.328531 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.328414 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/73a3d378-3878-418a-9279-539c643fab5a-metrics-server-audit-profiles\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.328531 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.328500 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a3d378-3878-418a-9279-539c643fab5a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.328531 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.328523 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-578cj\" (UniqueName: \"kubernetes.io/projected/73a3d378-3878-418a-9279-539c643fab5a-kube-api-access-578cj\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.328788 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.328762 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/73a3d378-3878-418a-9279-539c643fab5a-audit-log\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.328872 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.328848 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/73a3d378-3878-418a-9279-539c643fab5a-secret-metrics-server-client-certs\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.429325 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.429287 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/73a3d378-3878-418a-9279-539c643fab5a-audit-log\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.429513 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.429336 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/73a3d378-3878-418a-9279-539c643fab5a-secret-metrics-server-client-certs\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.429513 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.429363 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/73a3d378-3878-418a-9279-539c643fab5a-secret-metrics-server-tls\") pod 
\"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.429513 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.429387 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a3d378-3878-418a-9279-539c643fab5a-client-ca-bundle\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.429513 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.429451 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/73a3d378-3878-418a-9279-539c643fab5a-metrics-server-audit-profiles\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.429513 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.429483 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a3d378-3878-418a-9279-539c643fab5a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.429766 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.429516 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-578cj\" (UniqueName: \"kubernetes.io/projected/73a3d378-3878-418a-9279-539c643fab5a-kube-api-access-578cj\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.429896 ip-10-0-136-177 kubenswrapper[2581]: I0417 
11:19:29.429798 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/73a3d378-3878-418a-9279-539c643fab5a-audit-log\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.430318 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.430289 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a3d378-3878-418a-9279-539c643fab5a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.430626 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.430591 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/73a3d378-3878-418a-9279-539c643fab5a-metrics-server-audit-profiles\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.432207 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.432154 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/73a3d378-3878-418a-9279-539c643fab5a-secret-metrics-server-client-certs\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.432286 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.432255 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/73a3d378-3878-418a-9279-539c643fab5a-secret-metrics-server-tls\") pod 
\"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.432355 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.432335 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a3d378-3878-418a-9279-539c643fab5a-client-ca-bundle\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.438563 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.438541 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-578cj\" (UniqueName: \"kubernetes.io/projected/73a3d378-3878-418a-9279-539c643fab5a-kube-api-access-578cj\") pod \"metrics-server-bff94c98c-8ffzl\" (UID: \"73a3d378-3878-418a-9279-539c643fab5a\") " pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.523059 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.523013 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" Apr 17 11:19:29.546096 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.546063 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-776fv"] Apr 17 11:19:29.551484 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.549555 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-776fv" Apr 17 11:19:29.553043 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.553018 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 11:19:29.553162 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.553020 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-kjv6j\"" Apr 17 11:19:29.557724 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.557631 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-776fv"] Apr 17 11:19:29.630751 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.630712 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4b416cfd-b1d8-4cfb-a5f3-9d2cf17e642b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-776fv\" (UID: \"4b416cfd-b1d8-4cfb-a5f3-9d2cf17e642b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-776fv" Apr 17 11:19:29.731598 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.731520 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4b416cfd-b1d8-4cfb-a5f3-9d2cf17e642b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-776fv\" (UID: \"4b416cfd-b1d8-4cfb-a5f3-9d2cf17e642b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-776fv" Apr 17 11:19:29.735184 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.735159 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4b416cfd-b1d8-4cfb-a5f3-9d2cf17e642b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-776fv\" (UID: \"4b416cfd-b1d8-4cfb-a5f3-9d2cf17e642b\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-776fv" Apr 17 11:19:29.863303 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:29.863261 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-776fv" Apr 17 11:19:30.161606 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:30.161428 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-bff94c98c-8ffzl"] Apr 17 11:19:30.163603 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:19:30.163568 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73a3d378_3878_418a_9279_539c643fab5a.slice/crio-5133982440d71686b6074a0765f8f9a13388778c74ab8a11699c1a501c5238d4 WatchSource:0}: Error finding container 5133982440d71686b6074a0765f8f9a13388778c74ab8a11699c1a501c5238d4: Status 404 returned error can't find the container with id 5133982440d71686b6074a0765f8f9a13388778c74ab8a11699c1a501c5238d4 Apr 17 11:19:30.175543 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:30.175519 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-776fv"] Apr 17 11:19:30.178418 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:19:30.178369 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b416cfd_b1d8_4cfb_a5f3_9d2cf17e642b.slice/crio-d8b0ca1425d9d3c5972b3e06f3d8d336d52a5bce929eae999c051ddf68f8f8b4 WatchSource:0}: Error finding container d8b0ca1425d9d3c5972b3e06f3d8d336d52a5bce929eae999c051ddf68f8f8b4: Status 404 returned error can't find the container with id d8b0ca1425d9d3c5972b3e06f3d8d336d52a5bce929eae999c051ddf68f8f8b4 Apr 17 11:19:30.189570 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:30.189533 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" 
event={"ID":"73a3d378-3878-418a-9279-539c643fab5a","Type":"ContainerStarted","Data":"5133982440d71686b6074a0765f8f9a13388778c74ab8a11699c1a501c5238d4"} Apr 17 11:19:30.190887 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:30.190860 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-776fv" event={"ID":"4b416cfd-b1d8-4cfb-a5f3-9d2cf17e642b","Type":"ContainerStarted","Data":"d8b0ca1425d9d3c5972b3e06f3d8d336d52a5bce929eae999c051ddf68f8f8b4"} Apr 17 11:19:30.603627 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:30.603546 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7886847858-kdzp6" Apr 17 11:19:31.199834 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:31.199769 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" event={"ID":"4121857a-9f7b-48f9-83c8-b78509c1d47f","Type":"ContainerStarted","Data":"d4f39f6baf569e3d8dc2ca571a46934c656ef6d6f17d429d055704b811325c35"} Apr 17 11:19:31.200270 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:31.199845 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" event={"ID":"4121857a-9f7b-48f9-83c8-b78509c1d47f","Type":"ContainerStarted","Data":"cda9dab6133682dc60b8705cf0109574f1c4f34e1752770e8b9a1cc65a536cab"} Apr 17 11:19:31.200270 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:31.199860 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" event={"ID":"4121857a-9f7b-48f9-83c8-b78509c1d47f","Type":"ContainerStarted","Data":"22a6ad7e4f68e466d343cb4575d7b2028fe6aa2ad8dc7037404c6e35c21df600"} Apr 17 11:19:32.159410 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:32.159385 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rpkz7" Apr 17 11:19:32.205224 ip-10-0-136-177 
kubenswrapper[2581]: I0417 11:19:32.205151 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" event={"ID":"4121857a-9f7b-48f9-83c8-b78509c1d47f","Type":"ContainerStarted","Data":"6b3faee3a52f808acc3be7ce085a9cf50fb806cc0d27019404aef057cfd7cdbc"} Apr 17 11:19:32.205665 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:32.205633 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" event={"ID":"4121857a-9f7b-48f9-83c8-b78509c1d47f","Type":"ContainerStarted","Data":"a287c6e236a32584b46d3c76af003a7e1b2d9961c0bdb97f21c9a728462c36d7"} Apr 17 11:19:32.205789 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:32.205680 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" event={"ID":"4121857a-9f7b-48f9-83c8-b78509c1d47f","Type":"ContainerStarted","Data":"a07aa270fb2a9fa0c51b36073e7392fd249b8915e4f7d387b52e86ecaf498f82"} Apr 17 11:19:32.206214 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:32.206193 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" Apr 17 11:19:32.207496 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:32.207162 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" event={"ID":"73a3d378-3878-418a-9279-539c643fab5a","Type":"ContainerStarted","Data":"e6ed835ea8f991b67d9692a723bc51351948a9b0131924dffb4156e9f8818642"} Apr 17 11:19:32.209025 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:32.208701 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-776fv" event={"ID":"4b416cfd-b1d8-4cfb-a5f3-9d2cf17e642b","Type":"ContainerStarted","Data":"51a6ac1a25bcaa50ebaf3a71b0951b43e07250e801283092002b685197f49fe3"} Apr 17 11:19:32.209652 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:32.209631 
2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-776fv"
Apr 17 11:19:32.215013 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:32.214992 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-776fv"
Apr 17 11:19:32.228414 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:32.228369 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v" podStartSLOduration=1.589574545 podStartE2EDuration="5.228354277s" podCreationTimestamp="2026-04-17 11:19:27 +0000 UTC" firstStartedPulling="2026-04-17 11:19:28.344105907 +0000 UTC m=+177.286260131" lastFinishedPulling="2026-04-17 11:19:31.982885628 +0000 UTC m=+180.925039863" observedRunningTime="2026-04-17 11:19:32.228206098 +0000 UTC m=+181.170360362" watchObservedRunningTime="2026-04-17 11:19:32.228354277 +0000 UTC m=+181.170508564"
Apr 17 11:19:32.244149 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:32.244106 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-776fv" podStartSLOduration=1.441597931 podStartE2EDuration="3.244091953s" podCreationTimestamp="2026-04-17 11:19:29 +0000 UTC" firstStartedPulling="2026-04-17 11:19:30.180577453 +0000 UTC m=+179.122731678" lastFinishedPulling="2026-04-17 11:19:31.983071465 +0000 UTC m=+180.925225700" observedRunningTime="2026-04-17 11:19:32.243023259 +0000 UTC m=+181.185177504" watchObservedRunningTime="2026-04-17 11:19:32.244091953 +0000 UTC m=+181.186246237"
Apr 17 11:19:32.264309 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:32.264246 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl" podStartSLOduration=1.443845663 podStartE2EDuration="3.264225027s" podCreationTimestamp="2026-04-17 11:19:29 +0000 UTC" firstStartedPulling="2026-04-17 11:19:30.165780257 +0000 UTC m=+179.107934476" lastFinishedPulling="2026-04-17 11:19:31.986159602 +0000 UTC m=+180.928313840" observedRunningTime="2026-04-17 11:19:32.263368936 +0000 UTC m=+181.205523179" watchObservedRunningTime="2026-04-17 11:19:32.264225027 +0000 UTC m=+181.206379308"
Apr 17 11:19:35.619609 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.619545 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7886847858-kdzp6" podUID="e812ce5b-8977-4351-b49f-0cbf7496a798" containerName="registry" containerID="cri-o://06d76ea6314ec8e526e7f55535c0526a7ea21dd97ddfa0ba1706b8e6bcc7a0b7" gracePeriod=30
Apr 17 11:19:35.891831 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.891790 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7886847858-kdzp6"
Apr 17 11:19:35.990992 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.990113 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e812ce5b-8977-4351-b49f-0cbf7496a798-trusted-ca\") pod \"e812ce5b-8977-4351-b49f-0cbf7496a798\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") "
Apr 17 11:19:35.990992 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.990165 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e812ce5b-8977-4351-b49f-0cbf7496a798-image-registry-private-configuration\") pod \"e812ce5b-8977-4351-b49f-0cbf7496a798\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") "
Apr 17 11:19:35.990992 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.990199 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e812ce5b-8977-4351-b49f-0cbf7496a798-ca-trust-extracted\") pod \"e812ce5b-8977-4351-b49f-0cbf7496a798\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") "
Apr 17 11:19:35.990992 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.990238 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e812ce5b-8977-4351-b49f-0cbf7496a798-installation-pull-secrets\") pod \"e812ce5b-8977-4351-b49f-0cbf7496a798\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") "
Apr 17 11:19:35.990992 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.990276 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-certificates\") pod \"e812ce5b-8977-4351-b49f-0cbf7496a798\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") "
Apr 17 11:19:35.990992 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.990314 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf9h7\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-kube-api-access-lf9h7\") pod \"e812ce5b-8977-4351-b49f-0cbf7496a798\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") "
Apr 17 11:19:35.990992 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.990349 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls\") pod \"e812ce5b-8977-4351-b49f-0cbf7496a798\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") "
Apr 17 11:19:35.990992 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.990403 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-bound-sa-token\") pod \"e812ce5b-8977-4351-b49f-0cbf7496a798\" (UID: \"e812ce5b-8977-4351-b49f-0cbf7496a798\") "
Apr 17 11:19:35.991615 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.991573 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e812ce5b-8977-4351-b49f-0cbf7496a798-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e812ce5b-8977-4351-b49f-0cbf7496a798" (UID: "e812ce5b-8977-4351-b49f-0cbf7496a798"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:19:35.992031 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.991991 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e812ce5b-8977-4351-b49f-0cbf7496a798" (UID: "e812ce5b-8977-4351-b49f-0cbf7496a798"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:19:35.993768 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.993638 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e812ce5b-8977-4351-b49f-0cbf7496a798-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e812ce5b-8977-4351-b49f-0cbf7496a798" (UID: "e812ce5b-8977-4351-b49f-0cbf7496a798"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:19:35.993887 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.993780 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e812ce5b-8977-4351-b49f-0cbf7496a798-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e812ce5b-8977-4351-b49f-0cbf7496a798" (UID: "e812ce5b-8977-4351-b49f-0cbf7496a798"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:19:35.993887 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.993803 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-kube-api-access-lf9h7" (OuterVolumeSpecName: "kube-api-access-lf9h7") pod "e812ce5b-8977-4351-b49f-0cbf7496a798" (UID: "e812ce5b-8977-4351-b49f-0cbf7496a798"). InnerVolumeSpecName "kube-api-access-lf9h7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:19:35.993887 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.993852 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e812ce5b-8977-4351-b49f-0cbf7496a798" (UID: "e812ce5b-8977-4351-b49f-0cbf7496a798"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:19:35.994062 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:35.993944 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e812ce5b-8977-4351-b49f-0cbf7496a798" (UID: "e812ce5b-8977-4351-b49f-0cbf7496a798"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:19:36.002308 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.002275 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e812ce5b-8977-4351-b49f-0cbf7496a798-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e812ce5b-8977-4351-b49f-0cbf7496a798" (UID: "e812ce5b-8977-4351-b49f-0cbf7496a798"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 11:19:36.091138 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.091103 2581 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e812ce5b-8977-4351-b49f-0cbf7496a798-trusted-ca\") on node \"ip-10-0-136-177.ec2.internal\" DevicePath \"\""
Apr 17 11:19:36.091138 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.091136 2581 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e812ce5b-8977-4351-b49f-0cbf7496a798-image-registry-private-configuration\") on node \"ip-10-0-136-177.ec2.internal\" DevicePath \"\""
Apr 17 11:19:36.091138 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.091148 2581 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e812ce5b-8977-4351-b49f-0cbf7496a798-ca-trust-extracted\") on node \"ip-10-0-136-177.ec2.internal\" DevicePath \"\""
Apr 17 11:19:36.091361 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.091158 2581 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e812ce5b-8977-4351-b49f-0cbf7496a798-installation-pull-secrets\") on node \"ip-10-0-136-177.ec2.internal\" DevicePath \"\""
Apr 17 11:19:36.091361 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.091167 2581 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-certificates\") on node \"ip-10-0-136-177.ec2.internal\" DevicePath \"\""
Apr 17 11:19:36.091361 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.091177 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lf9h7\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-kube-api-access-lf9h7\") on node \"ip-10-0-136-177.ec2.internal\" DevicePath \"\""
Apr 17 11:19:36.091361 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.091186 2581 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-registry-tls\") on node \"ip-10-0-136-177.ec2.internal\" DevicePath \"\""
Apr 17 11:19:36.091361 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.091195 2581 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e812ce5b-8977-4351-b49f-0cbf7496a798-bound-sa-token\") on node \"ip-10-0-136-177.ec2.internal\" DevicePath \"\""
Apr 17 11:19:36.222330 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.222246 2581 generic.go:358] "Generic (PLEG): container finished" podID="e812ce5b-8977-4351-b49f-0cbf7496a798" containerID="06d76ea6314ec8e526e7f55535c0526a7ea21dd97ddfa0ba1706b8e6bcc7a0b7" exitCode=0
Apr 17 11:19:36.222330 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.222301 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7886847858-kdzp6" event={"ID":"e812ce5b-8977-4351-b49f-0cbf7496a798","Type":"ContainerDied","Data":"06d76ea6314ec8e526e7f55535c0526a7ea21dd97ddfa0ba1706b8e6bcc7a0b7"}
Apr 17 11:19:36.222330 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.222305 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7886847858-kdzp6"
Apr 17 11:19:36.222330 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.222328 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7886847858-kdzp6" event={"ID":"e812ce5b-8977-4351-b49f-0cbf7496a798","Type":"ContainerDied","Data":"8170b217f19348df6f13700bd7ce319615326ad14878a638af4efa933a8a1024"}
Apr 17 11:19:36.222613 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.222343 2581 scope.go:117] "RemoveContainer" containerID="06d76ea6314ec8e526e7f55535c0526a7ea21dd97ddfa0ba1706b8e6bcc7a0b7"
Apr 17 11:19:36.230447 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.230427 2581 scope.go:117] "RemoveContainer" containerID="06d76ea6314ec8e526e7f55535c0526a7ea21dd97ddfa0ba1706b8e6bcc7a0b7"
Apr 17 11:19:36.230718 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:19:36.230694 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06d76ea6314ec8e526e7f55535c0526a7ea21dd97ddfa0ba1706b8e6bcc7a0b7\": container with ID starting with 06d76ea6314ec8e526e7f55535c0526a7ea21dd97ddfa0ba1706b8e6bcc7a0b7 not found: ID does not exist" containerID="06d76ea6314ec8e526e7f55535c0526a7ea21dd97ddfa0ba1706b8e6bcc7a0b7"
Apr 17 11:19:36.230782 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.230723 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06d76ea6314ec8e526e7f55535c0526a7ea21dd97ddfa0ba1706b8e6bcc7a0b7"} err="failed to get container status \"06d76ea6314ec8e526e7f55535c0526a7ea21dd97ddfa0ba1706b8e6bcc7a0b7\": rpc error: code = NotFound desc = could not find container \"06d76ea6314ec8e526e7f55535c0526a7ea21dd97ddfa0ba1706b8e6bcc7a0b7\": container with ID starting with 06d76ea6314ec8e526e7f55535c0526a7ea21dd97ddfa0ba1706b8e6bcc7a0b7 not found: ID does not exist"
Apr 17 11:19:36.244712 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.244681 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7886847858-kdzp6"]
Apr 17 11:19:36.248673 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:36.248646 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7886847858-kdzp6"]
Apr 17 11:19:37.660859 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:37.660798 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e812ce5b-8977-4351-b49f-0cbf7496a798" path="/var/lib/kubelet/pods/e812ce5b-8977-4351-b49f-0cbf7496a798/volumes"
Apr 17 11:19:38.221211 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:38.221177 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-dccb875d5-dgb9v"
Apr 17 11:19:49.523732 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:49.523701 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl"
Apr 17 11:19:49.524133 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:19:49.523746 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl"
Apr 17 11:20:05.980436 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:05.980403 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-bff94c98c-8ffzl_73a3d378-3878-418a-9279-539c643fab5a/metrics-server/0.log"
Apr 17 11:20:06.182625 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:06.182595 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-776fv_4b416cfd-b1d8-4cfb-a5f3-9d2cf17e642b/monitoring-plugin/0.log"
Apr 17 11:20:06.988747 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:06.988716 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lwtkb_33907f69-05ed-46d3-a4c7-b24d3165e1e7/init-textfile/0.log"
Apr 17 11:20:07.182105 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:07.182059 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lwtkb_33907f69-05ed-46d3-a4c7-b24d3165e1e7/node-exporter/0.log"
Apr 17 11:20:07.383256 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:07.383225 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lwtkb_33907f69-05ed-46d3-a4c7-b24d3165e1e7/kube-rbac-proxy/0.log"
Apr 17 11:20:09.528968 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:09.528902 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl"
Apr 17 11:20:09.532715 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:09.532689 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-bff94c98c-8ffzl"
Apr 17 11:20:10.183344 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:10.183304 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-d6d28_aa810299-6a4d-4ef5-b2b3-abe18f37385c/prometheus-operator/0.log"
Apr 17 11:20:10.383923 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:10.383895 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-d6d28_aa810299-6a4d-4ef5-b2b3-abe18f37385c/kube-rbac-proxy/0.log"
Apr 17 11:20:10.782530 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:10.782499 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dccb875d5-dgb9v_4121857a-9f7b-48f9-83c8-b78509c1d47f/thanos-query/0.log"
Apr 17 11:20:10.988863 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:10.988812 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dccb875d5-dgb9v_4121857a-9f7b-48f9-83c8-b78509c1d47f/kube-rbac-proxy-web/0.log"
Apr 17 11:20:11.182033 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:11.182002 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dccb875d5-dgb9v_4121857a-9f7b-48f9-83c8-b78509c1d47f/kube-rbac-proxy/0.log"
Apr 17 11:20:11.382304 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:11.382277 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dccb875d5-dgb9v_4121857a-9f7b-48f9-83c8-b78509c1d47f/prom-label-proxy/0.log"
Apr 17 11:20:11.582568 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:11.582502 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dccb875d5-dgb9v_4121857a-9f7b-48f9-83c8-b78509c1d47f/kube-rbac-proxy-rules/0.log"
Apr 17 11:20:11.783477 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:11.783447 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dccb875d5-dgb9v_4121857a-9f7b-48f9-83c8-b78509c1d47f/kube-rbac-proxy-metrics/0.log"
Apr 17 11:20:11.981284 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:11.981258 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-zdr26_76699d2b-150e-4629-9c16-03548712a64f/networking-console-plugin/0.log"
Apr 17 11:20:43.339198 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:43.339160 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs\") pod \"network-metrics-daemon-9k5nh\" (UID: \"bee0bc88-7732-4010-9886-3df7384bf1c8\") " pod="openshift-multus/network-metrics-daemon-9k5nh"
Apr 17 11:20:43.341406 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:43.341387 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee0bc88-7732-4010-9886-3df7384bf1c8-metrics-certs\") pod \"network-metrics-daemon-9k5nh\" (UID: \"bee0bc88-7732-4010-9886-3df7384bf1c8\") " pod="openshift-multus/network-metrics-daemon-9k5nh"
Apr 17 11:20:43.561021 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:43.560990 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ntppc\""
Apr 17 11:20:43.568706 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:43.568688 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9k5nh"
Apr 17 11:20:43.687835 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:43.687790 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9k5nh"]
Apr 17 11:20:43.690808 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:20:43.690779 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbee0bc88_7732_4010_9886_3df7384bf1c8.slice/crio-903bc80101d197f1d80765940c61df93cc8d09fafa43df8207d5cd6a10706fb7 WatchSource:0}: Error finding container 903bc80101d197f1d80765940c61df93cc8d09fafa43df8207d5cd6a10706fb7: Status 404 returned error can't find the container with id 903bc80101d197f1d80765940c61df93cc8d09fafa43df8207d5cd6a10706fb7
Apr 17 11:20:44.407852 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:44.407797 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9k5nh" event={"ID":"bee0bc88-7732-4010-9886-3df7384bf1c8","Type":"ContainerStarted","Data":"903bc80101d197f1d80765940c61df93cc8d09fafa43df8207d5cd6a10706fb7"}
Apr 17 11:20:45.412368 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:45.412326 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9k5nh" event={"ID":"bee0bc88-7732-4010-9886-3df7384bf1c8","Type":"ContainerStarted","Data":"ad404a079c1d10685ae580e0eb934419e468af4a7ce30c6c911d9f1e79a0e625"}
Apr 17 11:20:45.412368 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:20:45.412367 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9k5nh" event={"ID":"bee0bc88-7732-4010-9886-3df7384bf1c8","Type":"ContainerStarted","Data":"5801f676f7078583d2c4421a7105d5c2d542ea9fe5d27be01e368647b203c2cc"}
Apr 17 11:21:31.511047 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:21:31.511021 2581 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 11:23:26.496597 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.496537 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9k5nh" podStartSLOduration=414.42545034 podStartE2EDuration="6m55.496519564s" podCreationTimestamp="2026-04-17 11:16:31 +0000 UTC" firstStartedPulling="2026-04-17 11:20:43.692897964 +0000 UTC m=+252.635052184" lastFinishedPulling="2026-04-17 11:20:44.763967185 +0000 UTC m=+253.706121408" observedRunningTime="2026-04-17 11:20:45.431035124 +0000 UTC m=+254.373189364" watchObservedRunningTime="2026-04-17 11:23:26.496519564 +0000 UTC m=+415.438673806"
Apr 17 11:23:26.497064 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.496693 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l"]
Apr 17 11:23:26.497064 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.496985 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e812ce5b-8977-4351-b49f-0cbf7496a798" containerName="registry"
Apr 17 11:23:26.497064 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.497008 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="e812ce5b-8977-4351-b49f-0cbf7496a798" containerName="registry"
Apr 17 11:23:26.497182 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.497097 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="e812ce5b-8977-4351-b49f-0cbf7496a798" containerName="registry"
Apr 17 11:23:26.499942 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.499924 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l"
Apr 17 11:23:26.503547 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.503528 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 17 11:23:26.503769 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.503754 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 17 11:23:26.503849 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.503781 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-scd7v\""
Apr 17 11:23:26.503849 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.503756 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 17 11:23:26.509336 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.509313 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l"]
Apr 17 11:23:26.561265 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.561228 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w88x\" (UniqueName: \"kubernetes.io/projected/3905fd80-7c9d-4e58-a7b6-15db563b3b9e-kube-api-access-2w88x\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l\" (UID: \"3905fd80-7c9d-4e58-a7b6-15db563b3b9e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l"
Apr 17 11:23:26.561265 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.561266 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/3905fd80-7c9d-4e58-a7b6-15db563b3b9e-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l\" (UID: \"3905fd80-7c9d-4e58-a7b6-15db563b3b9e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l"
Apr 17 11:23:26.661924 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.661891 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w88x\" (UniqueName: \"kubernetes.io/projected/3905fd80-7c9d-4e58-a7b6-15db563b3b9e-kube-api-access-2w88x\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l\" (UID: \"3905fd80-7c9d-4e58-a7b6-15db563b3b9e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l"
Apr 17 11:23:26.661924 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.661926 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/3905fd80-7c9d-4e58-a7b6-15db563b3b9e-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l\" (UID: \"3905fd80-7c9d-4e58-a7b6-15db563b3b9e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l"
Apr 17 11:23:26.664293 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.664269 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/3905fd80-7c9d-4e58-a7b6-15db563b3b9e-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l\" (UID: \"3905fd80-7c9d-4e58-a7b6-15db563b3b9e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l"
Apr 17 11:23:26.671343 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.671324 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w88x\" (UniqueName: \"kubernetes.io/projected/3905fd80-7c9d-4e58-a7b6-15db563b3b9e-kube-api-access-2w88x\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l\" (UID: \"3905fd80-7c9d-4e58-a7b6-15db563b3b9e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l"
Apr 17 11:23:26.810181 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.810094 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l"
Apr 17 11:23:26.933255 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.933232 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l"]
Apr 17 11:23:26.935996 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:23:26.935968 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3905fd80_7c9d_4e58_a7b6_15db563b3b9e.slice/crio-9a11e2ea41a4d95a31ce0a73077592d9f8f25f90ab94226c84fcbff016d17f2d WatchSource:0}: Error finding container 9a11e2ea41a4d95a31ce0a73077592d9f8f25f90ab94226c84fcbff016d17f2d: Status 404 returned error can't find the container with id 9a11e2ea41a4d95a31ce0a73077592d9f8f25f90ab94226c84fcbff016d17f2d
Apr 17 11:23:26.937626 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:26.937608 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 11:23:27.841687 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:27.841651 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l" event={"ID":"3905fd80-7c9d-4e58-a7b6-15db563b3b9e","Type":"ContainerStarted","Data":"9a11e2ea41a4d95a31ce0a73077592d9f8f25f90ab94226c84fcbff016d17f2d"}
Apr 17 11:23:30.854647 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:30.854545 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l" event={"ID":"3905fd80-7c9d-4e58-a7b6-15db563b3b9e","Type":"ContainerStarted","Data":"578e917d8f27822e83f3bc0b5364080666cea957be00b8985b7127f8523d57a9"}
Apr 17 11:23:30.854647 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:30.854632 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l"
Apr 17 11:23:30.879285 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:30.879234 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l" podStartSLOduration=1.33168623 podStartE2EDuration="4.879217824s" podCreationTimestamp="2026-04-17 11:23:26 +0000 UTC" firstStartedPulling="2026-04-17 11:23:26.937731772 +0000 UTC m=+415.879885991" lastFinishedPulling="2026-04-17 11:23:30.485263354 +0000 UTC m=+419.427417585" observedRunningTime="2026-04-17 11:23:30.877401044 +0000 UTC m=+419.819555303" watchObservedRunningTime="2026-04-17 11:23:30.879217824 +0000 UTC m=+419.821372064"
Apr 17 11:23:51.861141 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:23:51.861065 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kgw4l"
Apr 17 11:24:40.063746 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.063715 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dpx74"]
Apr 17 11:24:40.067376 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.067347 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpx74"
Apr 17 11:24:40.072972 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.072948 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 17 11:24:40.073319 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.073289 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 17 11:24:40.073433 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.073358 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 17 11:24:40.073697 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.073581 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-fltfk\""
Apr 17 11:24:40.086193 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.086169 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dpx74"]
Apr 17 11:24:40.101489 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.101465 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-6zs2d"]
Apr 17 11:24:40.104630 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.104615 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-6zs2d"
Apr 17 11:24:40.107325 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.107304 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 17 11:24:40.107554 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.107539 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-74mq2\""
Apr 17 11:24:40.118313 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.118292 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-6zs2d"]
Apr 17 11:24:40.129285 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.129257 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/41b5af85-5a6d-4a53-a59d-2fddbd2b8d95-data\") pod \"seaweedfs-86cc847c5c-6zs2d\" (UID: \"41b5af85-5a6d-4a53-a59d-2fddbd2b8d95\") " pod="kserve/seaweedfs-86cc847c5c-6zs2d"
Apr 17 11:24:40.129405 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.129312 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f270178e-46a1-4abc-a804-f86a39b8dd51-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dpx74\" (UID: \"f270178e-46a1-4abc-a804-f86a39b8dd51\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpx74"
Apr 17 11:24:40.129405 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.129335 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpwqp\" (UniqueName: \"kubernetes.io/projected/f270178e-46a1-4abc-a804-f86a39b8dd51-kube-api-access-fpwqp\") pod \"llmisvc-controller-manager-68cc5db7c4-dpx74\" (UID: \"f270178e-46a1-4abc-a804-f86a39b8dd51\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpx74"
Apr 17 11:24:40.129405 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.129360 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9w5c\" (UniqueName: \"kubernetes.io/projected/41b5af85-5a6d-4a53-a59d-2fddbd2b8d95-kube-api-access-q9w5c\") pod \"seaweedfs-86cc847c5c-6zs2d\" (UID: \"41b5af85-5a6d-4a53-a59d-2fddbd2b8d95\") " pod="kserve/seaweedfs-86cc847c5c-6zs2d"
Apr 17 11:24:40.230283 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.230251 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f270178e-46a1-4abc-a804-f86a39b8dd51-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dpx74\" (UID: \"f270178e-46a1-4abc-a804-f86a39b8dd51\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpx74"
Apr 17 11:24:40.230283 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.230284 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpwqp\" (UniqueName: \"kubernetes.io/projected/f270178e-46a1-4abc-a804-f86a39b8dd51-kube-api-access-fpwqp\") pod \"llmisvc-controller-manager-68cc5db7c4-dpx74\" (UID: \"f270178e-46a1-4abc-a804-f86a39b8dd51\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpx74"
Apr 17 11:24:40.230484 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.230305 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9w5c\" (UniqueName: \"kubernetes.io/projected/41b5af85-5a6d-4a53-a59d-2fddbd2b8d95-kube-api-access-q9w5c\") pod \"seaweedfs-86cc847c5c-6zs2d\" (UID: \"41b5af85-5a6d-4a53-a59d-2fddbd2b8d95\") " pod="kserve/seaweedfs-86cc847c5c-6zs2d"
Apr 17 11:24:40.230484 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.230349 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/41b5af85-5a6d-4a53-a59d-2fddbd2b8d95-data\") pod \"seaweedfs-86cc847c5c-6zs2d\" (UID: \"41b5af85-5a6d-4a53-a59d-2fddbd2b8d95\") " pod="kserve/seaweedfs-86cc847c5c-6zs2d"
Apr 17 11:24:40.230722 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.230705 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/41b5af85-5a6d-4a53-a59d-2fddbd2b8d95-data\") pod \"seaweedfs-86cc847c5c-6zs2d\" (UID: \"41b5af85-5a6d-4a53-a59d-2fddbd2b8d95\") " pod="kserve/seaweedfs-86cc847c5c-6zs2d"
Apr 17 11:24:40.232692 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.232675 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f270178e-46a1-4abc-a804-f86a39b8dd51-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dpx74\" (UID: \"f270178e-46a1-4abc-a804-f86a39b8dd51\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpx74"
Apr 17 11:24:40.244903 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.244866 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9w5c\" (UniqueName: \"kubernetes.io/projected/41b5af85-5a6d-4a53-a59d-2fddbd2b8d95-kube-api-access-q9w5c\") pod \"seaweedfs-86cc847c5c-6zs2d\" (UID: \"41b5af85-5a6d-4a53-a59d-2fddbd2b8d95\") " pod="kserve/seaweedfs-86cc847c5c-6zs2d"
Apr 17 11:24:40.245023 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.244875 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpwqp\" (UniqueName: \"kubernetes.io/projected/f270178e-46a1-4abc-a804-f86a39b8dd51-kube-api-access-fpwqp\") pod \"llmisvc-controller-manager-68cc5db7c4-dpx74\" (UID: \"f270178e-46a1-4abc-a804-f86a39b8dd51\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpx74"
Apr 17 11:24:40.380589 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.380554 2581 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpx74" Apr 17 11:24:40.414515 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.414483 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-6zs2d" Apr 17 11:24:40.543202 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.543167 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dpx74"] Apr 17 11:24:40.547039 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:24:40.547010 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf270178e_46a1_4abc_a804_f86a39b8dd51.slice/crio-a860c6c60ec62f31d909d455026ccffe21b3d08b1df15ed4f3ba9c601309c505 WatchSource:0}: Error finding container a860c6c60ec62f31d909d455026ccffe21b3d08b1df15ed4f3ba9c601309c505: Status 404 returned error can't find the container with id a860c6c60ec62f31d909d455026ccffe21b3d08b1df15ed4f3ba9c601309c505 Apr 17 11:24:40.577198 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:40.577174 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-6zs2d"] Apr 17 11:24:40.579501 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:24:40.579478 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41b5af85_5a6d_4a53_a59d_2fddbd2b8d95.slice/crio-4ce9efa2057d3130286b7fea440228995d2fb9fb31561f2eec8914d2aceb08af WatchSource:0}: Error finding container 4ce9efa2057d3130286b7fea440228995d2fb9fb31561f2eec8914d2aceb08af: Status 404 returned error can't find the container with id 4ce9efa2057d3130286b7fea440228995d2fb9fb31561f2eec8914d2aceb08af Apr 17 11:24:41.052340 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:41.052059 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpx74" 
event={"ID":"f270178e-46a1-4abc-a804-f86a39b8dd51","Type":"ContainerStarted","Data":"a860c6c60ec62f31d909d455026ccffe21b3d08b1df15ed4f3ba9c601309c505"} Apr 17 11:24:41.054352 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:41.054306 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-6zs2d" event={"ID":"41b5af85-5a6d-4a53-a59d-2fddbd2b8d95","Type":"ContainerStarted","Data":"4ce9efa2057d3130286b7fea440228995d2fb9fb31561f2eec8914d2aceb08af"} Apr 17 11:24:44.066191 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:44.066151 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-6zs2d" event={"ID":"41b5af85-5a6d-4a53-a59d-2fddbd2b8d95","Type":"ContainerStarted","Data":"a643872417312403e99fa474fef494697fe0a54d700776d1dc9155fe2b669f0b"} Apr 17 11:24:44.066630 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:44.066401 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-6zs2d" Apr 17 11:24:44.067373 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:44.067349 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpx74" event={"ID":"f270178e-46a1-4abc-a804-f86a39b8dd51","Type":"ContainerStarted","Data":"98524ce371fb28816c460d4edd643cc306b842b9d774c5c16de2292e536be3f7"} Apr 17 11:24:44.067505 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:44.067489 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpx74" Apr 17 11:24:44.090037 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:44.089985 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-6zs2d" podStartSLOduration=0.828902043 podStartE2EDuration="4.089969454s" podCreationTimestamp="2026-04-17 11:24:40 +0000 UTC" firstStartedPulling="2026-04-17 11:24:40.580813923 +0000 UTC m=+489.522968142" lastFinishedPulling="2026-04-17 
11:24:43.841881334 +0000 UTC m=+492.784035553" observedRunningTime="2026-04-17 11:24:44.088275659 +0000 UTC m=+493.030429911" watchObservedRunningTime="2026-04-17 11:24:44.089969454 +0000 UTC m=+493.032123695" Apr 17 11:24:44.111052 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:44.110069 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpx74" podStartSLOduration=0.869483456 podStartE2EDuration="4.110051595s" podCreationTimestamp="2026-04-17 11:24:40 +0000 UTC" firstStartedPulling="2026-04-17 11:24:40.548275355 +0000 UTC m=+489.490429574" lastFinishedPulling="2026-04-17 11:24:43.788843491 +0000 UTC m=+492.730997713" observedRunningTime="2026-04-17 11:24:44.10863312 +0000 UTC m=+493.050787361" watchObservedRunningTime="2026-04-17 11:24:44.110051595 +0000 UTC m=+493.052205836" Apr 17 11:24:50.072513 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:24:50.072481 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-6zs2d" Apr 17 11:25:15.073020 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:15.072990 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpx74" Apr 17 11:25:50.041272 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:50.041189 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-9hwhx"] Apr 17 11:25:50.044369 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:50.044352 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-9hwhx" Apr 17 11:25:50.047237 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:50.047212 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-ssxsk\"" Apr 17 11:25:50.047365 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:50.047337 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 17 11:25:50.057541 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:50.057516 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-9hwhx"] Apr 17 11:25:50.083085 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:50.083049 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/752a2175-fdc0-4bc2-9946-3f222fd602f6-cert\") pod \"odh-model-controller-696fc77849-9hwhx\" (UID: \"752a2175-fdc0-4bc2-9946-3f222fd602f6\") " pod="kserve/odh-model-controller-696fc77849-9hwhx" Apr 17 11:25:50.083213 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:50.083089 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q877c\" (UniqueName: \"kubernetes.io/projected/752a2175-fdc0-4bc2-9946-3f222fd602f6-kube-api-access-q877c\") pod \"odh-model-controller-696fc77849-9hwhx\" (UID: \"752a2175-fdc0-4bc2-9946-3f222fd602f6\") " pod="kserve/odh-model-controller-696fc77849-9hwhx" Apr 17 11:25:50.183743 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:50.183708 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/752a2175-fdc0-4bc2-9946-3f222fd602f6-cert\") pod \"odh-model-controller-696fc77849-9hwhx\" (UID: \"752a2175-fdc0-4bc2-9946-3f222fd602f6\") " pod="kserve/odh-model-controller-696fc77849-9hwhx" Apr 17 11:25:50.183743 
ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:50.183747 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q877c\" (UniqueName: \"kubernetes.io/projected/752a2175-fdc0-4bc2-9946-3f222fd602f6-kube-api-access-q877c\") pod \"odh-model-controller-696fc77849-9hwhx\" (UID: \"752a2175-fdc0-4bc2-9946-3f222fd602f6\") " pod="kserve/odh-model-controller-696fc77849-9hwhx" Apr 17 11:25:50.183984 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:25:50.183861 2581 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 11:25:50.183984 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:25:50.183944 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/752a2175-fdc0-4bc2-9946-3f222fd602f6-cert podName:752a2175-fdc0-4bc2-9946-3f222fd602f6 nodeName:}" failed. No retries permitted until 2026-04-17 11:25:50.683929386 +0000 UTC m=+559.626083610 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/752a2175-fdc0-4bc2-9946-3f222fd602f6-cert") pod "odh-model-controller-696fc77849-9hwhx" (UID: "752a2175-fdc0-4bc2-9946-3f222fd602f6") : secret "odh-model-controller-webhook-cert" not found Apr 17 11:25:50.197675 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:50.197651 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q877c\" (UniqueName: \"kubernetes.io/projected/752a2175-fdc0-4bc2-9946-3f222fd602f6-kube-api-access-q877c\") pod \"odh-model-controller-696fc77849-9hwhx\" (UID: \"752a2175-fdc0-4bc2-9946-3f222fd602f6\") " pod="kserve/odh-model-controller-696fc77849-9hwhx" Apr 17 11:25:50.688095 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:50.688050 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/752a2175-fdc0-4bc2-9946-3f222fd602f6-cert\") pod \"odh-model-controller-696fc77849-9hwhx\" (UID: \"752a2175-fdc0-4bc2-9946-3f222fd602f6\") " pod="kserve/odh-model-controller-696fc77849-9hwhx" Apr 17 11:25:50.690446 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:50.690423 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/752a2175-fdc0-4bc2-9946-3f222fd602f6-cert\") pod \"odh-model-controller-696fc77849-9hwhx\" (UID: \"752a2175-fdc0-4bc2-9946-3f222fd602f6\") " pod="kserve/odh-model-controller-696fc77849-9hwhx" Apr 17 11:25:50.954455 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:50.954376 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-9hwhx" Apr 17 11:25:51.082718 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:51.082688 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-9hwhx"] Apr 17 11:25:51.085676 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:25:51.085646 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod752a2175_fdc0_4bc2_9946_3f222fd602f6.slice/crio-b486faa2232878d8b6d40c031ca868064df082dc8ed30806988a98f5d8c2a120 WatchSource:0}: Error finding container b486faa2232878d8b6d40c031ca868064df082dc8ed30806988a98f5d8c2a120: Status 404 returned error can't find the container with id b486faa2232878d8b6d40c031ca868064df082dc8ed30806988a98f5d8c2a120 Apr 17 11:25:51.280641 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:51.280556 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-9hwhx" event={"ID":"752a2175-fdc0-4bc2-9946-3f222fd602f6","Type":"ContainerStarted","Data":"b486faa2232878d8b6d40c031ca868064df082dc8ed30806988a98f5d8c2a120"} Apr 17 11:25:54.293029 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:54.292993 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-9hwhx" event={"ID":"752a2175-fdc0-4bc2-9946-3f222fd602f6","Type":"ContainerStarted","Data":"7b4cfe08abfa69085d4a187d846d46abea0a233d6b4e43480ab7e539d40b5d2e"} Apr 17 11:25:54.293417 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:54.293215 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-9hwhx" Apr 17 11:25:54.310908 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:25:54.310858 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-9hwhx" podStartSLOduration=1.833155834 podStartE2EDuration="4.310845224s" 
podCreationTimestamp="2026-04-17 11:25:50 +0000 UTC" firstStartedPulling="2026-04-17 11:25:51.086867857 +0000 UTC m=+560.029022076" lastFinishedPulling="2026-04-17 11:25:53.564557233 +0000 UTC m=+562.506711466" observedRunningTime="2026-04-17 11:25:54.30991835 +0000 UTC m=+563.252072592" watchObservedRunningTime="2026-04-17 11:25:54.310845224 +0000 UTC m=+563.252999465" Apr 17 11:26:05.298655 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:05.298625 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-9hwhx" Apr 17 11:26:06.102308 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:06.102270 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-wwz87"] Apr 17 11:26:06.108551 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:06.108526 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-wwz87" Apr 17 11:26:06.112532 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:06.112497 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-wwz87"] Apr 17 11:26:06.218160 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:06.218123 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbwwn\" (UniqueName: \"kubernetes.io/projected/8ca26738-9d31-4c23-9efc-3ef67a7d304c-kube-api-access-fbwwn\") pod \"s3-init-wwz87\" (UID: \"8ca26738-9d31-4c23-9efc-3ef67a7d304c\") " pod="kserve/s3-init-wwz87" Apr 17 11:26:06.318581 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:06.318541 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbwwn\" (UniqueName: \"kubernetes.io/projected/8ca26738-9d31-4c23-9efc-3ef67a7d304c-kube-api-access-fbwwn\") pod \"s3-init-wwz87\" (UID: \"8ca26738-9d31-4c23-9efc-3ef67a7d304c\") " pod="kserve/s3-init-wwz87" Apr 17 11:26:06.327380 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:06.327350 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbwwn\" (UniqueName: \"kubernetes.io/projected/8ca26738-9d31-4c23-9efc-3ef67a7d304c-kube-api-access-fbwwn\") pod \"s3-init-wwz87\" (UID: \"8ca26738-9d31-4c23-9efc-3ef67a7d304c\") " pod="kserve/s3-init-wwz87" Apr 17 11:26:06.418616 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:06.418578 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-wwz87" Apr 17 11:26:06.538147 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:06.538097 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-wwz87"] Apr 17 11:26:06.540838 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:26:06.540792 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ca26738_9d31_4c23_9efc_3ef67a7d304c.slice/crio-463d0f10c7761eeac3b77b514de282b7568077c8c70f2b733dd3a9dded9e255c WatchSource:0}: Error finding container 463d0f10c7761eeac3b77b514de282b7568077c8c70f2b733dd3a9dded9e255c: Status 404 returned error can't find the container with id 463d0f10c7761eeac3b77b514de282b7568077c8c70f2b733dd3a9dded9e255c Apr 17 11:26:07.337198 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:07.337157 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wwz87" event={"ID":"8ca26738-9d31-4c23-9efc-3ef67a7d304c","Type":"ContainerStarted","Data":"463d0f10c7761eeac3b77b514de282b7568077c8c70f2b733dd3a9dded9e255c"} Apr 17 11:26:11.352803 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:11.352765 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wwz87" event={"ID":"8ca26738-9d31-4c23-9efc-3ef67a7d304c","Type":"ContainerStarted","Data":"50641b072d2e51c8a3179c7c47ba487adc8714a0ce1691ed676dab29f1aff20e"} Apr 17 11:26:11.368477 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:11.368404 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/s3-init-wwz87" podStartSLOduration=0.781371181 podStartE2EDuration="5.36838248s" podCreationTimestamp="2026-04-17 11:26:06 +0000 UTC" firstStartedPulling="2026-04-17 11:26:06.542723012 +0000 UTC m=+575.484877231" lastFinishedPulling="2026-04-17 11:26:11.129734309 +0000 UTC m=+580.071888530" observedRunningTime="2026-04-17 11:26:11.36767512 +0000 UTC m=+580.309829361" watchObservedRunningTime="2026-04-17 11:26:11.36838248 +0000 UTC m=+580.310536722" Apr 17 11:26:14.362690 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:14.362651 2581 generic.go:358] "Generic (PLEG): container finished" podID="8ca26738-9d31-4c23-9efc-3ef67a7d304c" containerID="50641b072d2e51c8a3179c7c47ba487adc8714a0ce1691ed676dab29f1aff20e" exitCode=0 Apr 17 11:26:14.363160 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:14.362728 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wwz87" event={"ID":"8ca26738-9d31-4c23-9efc-3ef67a7d304c","Type":"ContainerDied","Data":"50641b072d2e51c8a3179c7c47ba487adc8714a0ce1691ed676dab29f1aff20e"} Apr 17 11:26:15.494687 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:15.494663 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-wwz87" Apr 17 11:26:15.602067 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:15.602028 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbwwn\" (UniqueName: \"kubernetes.io/projected/8ca26738-9d31-4c23-9efc-3ef67a7d304c-kube-api-access-fbwwn\") pod \"8ca26738-9d31-4c23-9efc-3ef67a7d304c\" (UID: \"8ca26738-9d31-4c23-9efc-3ef67a7d304c\") " Apr 17 11:26:15.604263 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:15.604231 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ca26738-9d31-4c23-9efc-3ef67a7d304c-kube-api-access-fbwwn" (OuterVolumeSpecName: "kube-api-access-fbwwn") pod "8ca26738-9d31-4c23-9efc-3ef67a7d304c" (UID: "8ca26738-9d31-4c23-9efc-3ef67a7d304c"). InnerVolumeSpecName "kube-api-access-fbwwn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:26:15.702697 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:15.702657 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fbwwn\" (UniqueName: \"kubernetes.io/projected/8ca26738-9d31-4c23-9efc-3ef67a7d304c-kube-api-access-fbwwn\") on node \"ip-10-0-136-177.ec2.internal\" DevicePath \"\"" Apr 17 11:26:16.369734 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:16.369708 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-wwz87" Apr 17 11:26:16.369957 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:16.369705 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wwz87" event={"ID":"8ca26738-9d31-4c23-9efc-3ef67a7d304c","Type":"ContainerDied","Data":"463d0f10c7761eeac3b77b514de282b7568077c8c70f2b733dd3a9dded9e255c"} Apr 17 11:26:16.369957 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:16.369811 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="463d0f10c7761eeac3b77b514de282b7568077c8c70f2b733dd3a9dded9e255c" Apr 17 11:26:20.248495 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.248450 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-klv7k/must-gather-glnrg"] Apr 17 11:26:20.249026 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.248963 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ca26738-9d31-4c23-9efc-3ef67a7d304c" containerName="s3-init" Apr 17 11:26:20.249026 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.248981 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca26738-9d31-4c23-9efc-3ef67a7d304c" containerName="s3-init" Apr 17 11:26:20.249140 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.249050 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ca26738-9d31-4c23-9efc-3ef67a7d304c" containerName="s3-init" Apr 17 11:26:20.252340 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.252315 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-klv7k/must-gather-glnrg" Apr 17 11:26:20.256875 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.256848 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-klv7k\"/\"openshift-service-ca.crt\"" Apr 17 11:26:20.257017 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.256921 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-klv7k\"/\"kube-root-ca.crt\"" Apr 17 11:26:20.257017 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.256970 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-klv7k\"/\"default-dockercfg-cks7h\"" Apr 17 11:26:20.268118 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.268087 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-klv7k/must-gather-glnrg"] Apr 17 11:26:20.342697 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.342647 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9ed7d5a-53fa-425b-9879-ef04966bb6b1-must-gather-output\") pod \"must-gather-glnrg\" (UID: \"f9ed7d5a-53fa-425b-9879-ef04966bb6b1\") " pod="openshift-must-gather-klv7k/must-gather-glnrg" Apr 17 11:26:20.342917 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.342776 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vvl7\" (UniqueName: \"kubernetes.io/projected/f9ed7d5a-53fa-425b-9879-ef04966bb6b1-kube-api-access-8vvl7\") pod \"must-gather-glnrg\" (UID: \"f9ed7d5a-53fa-425b-9879-ef04966bb6b1\") " pod="openshift-must-gather-klv7k/must-gather-glnrg" Apr 17 11:26:20.443354 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.443298 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vvl7\" (UniqueName: 
\"kubernetes.io/projected/f9ed7d5a-53fa-425b-9879-ef04966bb6b1-kube-api-access-8vvl7\") pod \"must-gather-glnrg\" (UID: \"f9ed7d5a-53fa-425b-9879-ef04966bb6b1\") " pod="openshift-must-gather-klv7k/must-gather-glnrg" Apr 17 11:26:20.443552 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.443384 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9ed7d5a-53fa-425b-9879-ef04966bb6b1-must-gather-output\") pod \"must-gather-glnrg\" (UID: \"f9ed7d5a-53fa-425b-9879-ef04966bb6b1\") " pod="openshift-must-gather-klv7k/must-gather-glnrg" Apr 17 11:26:20.443722 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.443705 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9ed7d5a-53fa-425b-9879-ef04966bb6b1-must-gather-output\") pod \"must-gather-glnrg\" (UID: \"f9ed7d5a-53fa-425b-9879-ef04966bb6b1\") " pod="openshift-must-gather-klv7k/must-gather-glnrg" Apr 17 11:26:20.459005 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.458980 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vvl7\" (UniqueName: \"kubernetes.io/projected/f9ed7d5a-53fa-425b-9879-ef04966bb6b1-kube-api-access-8vvl7\") pod \"must-gather-glnrg\" (UID: \"f9ed7d5a-53fa-425b-9879-ef04966bb6b1\") " pod="openshift-must-gather-klv7k/must-gather-glnrg" Apr 17 11:26:20.561782 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.561687 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-klv7k/must-gather-glnrg" Apr 17 11:26:20.686668 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:20.686642 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-klv7k/must-gather-glnrg"] Apr 17 11:26:20.688850 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:26:20.688810 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9ed7d5a_53fa_425b_9879_ef04966bb6b1.slice/crio-fd61d6208f02cc5bc31b484028a6f662b6273d287346e9196d257b6fe23fe8f2 WatchSource:0}: Error finding container fd61d6208f02cc5bc31b484028a6f662b6273d287346e9196d257b6fe23fe8f2: Status 404 returned error can't find the container with id fd61d6208f02cc5bc31b484028a6f662b6273d287346e9196d257b6fe23fe8f2 Apr 17 11:26:21.392075 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:21.392013 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-klv7k/must-gather-glnrg" event={"ID":"f9ed7d5a-53fa-425b-9879-ef04966bb6b1","Type":"ContainerStarted","Data":"fd61d6208f02cc5bc31b484028a6f662b6273d287346e9196d257b6fe23fe8f2"} Apr 17 11:26:25.407591 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:25.407551 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-klv7k/must-gather-glnrg" event={"ID":"f9ed7d5a-53fa-425b-9879-ef04966bb6b1","Type":"ContainerStarted","Data":"3fab8e109ba20c3a71c245e2d1413840c7eda19a273b6ae469ec3ee2a499ddfa"} Apr 17 11:26:25.407591 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:25.407594 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-klv7k/must-gather-glnrg" event={"ID":"f9ed7d5a-53fa-425b-9879-ef04966bb6b1","Type":"ContainerStarted","Data":"bae88cdb9ee40c73fe4cc7bc604d342503d1af05d9af5509e8cb290ddf3c3b8b"} Apr 17 11:26:25.426687 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:25.426623 2581 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-klv7k/must-gather-glnrg" podStartSLOduration=1.142525504 podStartE2EDuration="5.426605333s" podCreationTimestamp="2026-04-17 11:26:20 +0000 UTC" firstStartedPulling="2026-04-17 11:26:20.690443811 +0000 UTC m=+589.632598029" lastFinishedPulling="2026-04-17 11:26:24.974523628 +0000 UTC m=+593.916677858" observedRunningTime="2026-04-17 11:26:25.423271192 +0000 UTC m=+594.365425435" watchObservedRunningTime="2026-04-17 11:26:25.426605333 +0000 UTC m=+594.368759575" Apr 17 11:26:45.477427 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:45.477393 2581 generic.go:358] "Generic (PLEG): container finished" podID="f9ed7d5a-53fa-425b-9879-ef04966bb6b1" containerID="bae88cdb9ee40c73fe4cc7bc604d342503d1af05d9af5509e8cb290ddf3c3b8b" exitCode=0 Apr 17 11:26:45.477873 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:45.477474 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-klv7k/must-gather-glnrg" event={"ID":"f9ed7d5a-53fa-425b-9879-ef04966bb6b1","Type":"ContainerDied","Data":"bae88cdb9ee40c73fe4cc7bc604d342503d1af05d9af5509e8cb290ddf3c3b8b"} Apr 17 11:26:45.477873 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:45.477788 2581 scope.go:117] "RemoveContainer" containerID="bae88cdb9ee40c73fe4cc7bc604d342503d1af05d9af5509e8cb290ddf3c3b8b" Apr 17 11:26:45.532395 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:45.532365 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-klv7k_must-gather-glnrg_f9ed7d5a-53fa-425b-9879-ef04966bb6b1/gather/0.log" Apr 17 11:26:48.804859 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:48.804808 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-t82fh_09f10455-02ae-4c95-91a9-6c0b6af2b02f/global-pull-secret-syncer/0.log" Apr 17 11:26:48.940144 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:48.940112 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-qxx8v_ba8cdaa8-b4aa-4821-9350-c03f9eb0b50d/konnectivity-agent/0.log" Apr 17 11:26:49.026200 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:49.026168 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-177.ec2.internal_f24d5b8e4287d839ebd797e520397e0b/haproxy/0.log" Apr 17 11:26:50.853674 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:50.853586 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-klv7k/must-gather-glnrg"] Apr 17 11:26:50.854114 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:50.853831 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-klv7k/must-gather-glnrg" podUID="f9ed7d5a-53fa-425b-9879-ef04966bb6b1" containerName="copy" containerID="cri-o://3fab8e109ba20c3a71c245e2d1413840c7eda19a273b6ae469ec3ee2a499ddfa" gracePeriod=2 Apr 17 11:26:50.856440 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:50.856396 2581 status_manager.go:895] "Failed to get status for pod" podUID="f9ed7d5a-53fa-425b-9879-ef04966bb6b1" pod="openshift-must-gather-klv7k/must-gather-glnrg" err="pods \"must-gather-glnrg\" is forbidden: User \"system:node:ip-10-0-136-177.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-klv7k\": no relationship found between node 'ip-10-0-136-177.ec2.internal' and this object" Apr 17 11:26:50.858078 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:50.857666 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-klv7k/must-gather-glnrg"] Apr 17 11:26:51.080470 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.080446 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-klv7k_must-gather-glnrg_f9ed7d5a-53fa-425b-9879-ef04966bb6b1/copy/0.log" Apr 17 11:26:51.080799 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.080782 2581 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-klv7k/must-gather-glnrg" Apr 17 11:26:51.082960 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.082936 2581 status_manager.go:895] "Failed to get status for pod" podUID="f9ed7d5a-53fa-425b-9879-ef04966bb6b1" pod="openshift-must-gather-klv7k/must-gather-glnrg" err="pods \"must-gather-glnrg\" is forbidden: User \"system:node:ip-10-0-136-177.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-klv7k\": no relationship found between node 'ip-10-0-136-177.ec2.internal' and this object" Apr 17 11:26:51.088290 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.088274 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9ed7d5a-53fa-425b-9879-ef04966bb6b1-must-gather-output\") pod \"f9ed7d5a-53fa-425b-9879-ef04966bb6b1\" (UID: \"f9ed7d5a-53fa-425b-9879-ef04966bb6b1\") " Apr 17 11:26:51.088361 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.088325 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vvl7\" (UniqueName: \"kubernetes.io/projected/f9ed7d5a-53fa-425b-9879-ef04966bb6b1-kube-api-access-8vvl7\") pod \"f9ed7d5a-53fa-425b-9879-ef04966bb6b1\" (UID: \"f9ed7d5a-53fa-425b-9879-ef04966bb6b1\") " Apr 17 11:26:51.089621 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.089593 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9ed7d5a-53fa-425b-9879-ef04966bb6b1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f9ed7d5a-53fa-425b-9879-ef04966bb6b1" (UID: "f9ed7d5a-53fa-425b-9879-ef04966bb6b1"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:26:51.090611 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.090595 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ed7d5a-53fa-425b-9879-ef04966bb6b1-kube-api-access-8vvl7" (OuterVolumeSpecName: "kube-api-access-8vvl7") pod "f9ed7d5a-53fa-425b-9879-ef04966bb6b1" (UID: "f9ed7d5a-53fa-425b-9879-ef04966bb6b1"). InnerVolumeSpecName "kube-api-access-8vvl7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:26:51.189202 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.189160 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8vvl7\" (UniqueName: \"kubernetes.io/projected/f9ed7d5a-53fa-425b-9879-ef04966bb6b1-kube-api-access-8vvl7\") on node \"ip-10-0-136-177.ec2.internal\" DevicePath \"\"" Apr 17 11:26:51.189202 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.189196 2581 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f9ed7d5a-53fa-425b-9879-ef04966bb6b1-must-gather-output\") on node \"ip-10-0-136-177.ec2.internal\" DevicePath \"\"" Apr 17 11:26:51.496837 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.496734 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-klv7k_must-gather-glnrg_f9ed7d5a-53fa-425b-9879-ef04966bb6b1/copy/0.log" Apr 17 11:26:51.497102 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.497078 2581 generic.go:358] "Generic (PLEG): container finished" podID="f9ed7d5a-53fa-425b-9879-ef04966bb6b1" containerID="3fab8e109ba20c3a71c245e2d1413840c7eda19a273b6ae469ec3ee2a499ddfa" exitCode=143 Apr 17 11:26:51.497172 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.497133 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-klv7k/must-gather-glnrg" Apr 17 11:26:51.497172 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.497144 2581 scope.go:117] "RemoveContainer" containerID="3fab8e109ba20c3a71c245e2d1413840c7eda19a273b6ae469ec3ee2a499ddfa" Apr 17 11:26:51.499528 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.499495 2581 status_manager.go:895] "Failed to get status for pod" podUID="f9ed7d5a-53fa-425b-9879-ef04966bb6b1" pod="openshift-must-gather-klv7k/must-gather-glnrg" err="pods \"must-gather-glnrg\" is forbidden: User \"system:node:ip-10-0-136-177.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-klv7k\": no relationship found between node 'ip-10-0-136-177.ec2.internal' and this object" Apr 17 11:26:51.505918 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.505858 2581 scope.go:117] "RemoveContainer" containerID="bae88cdb9ee40c73fe4cc7bc604d342503d1af05d9af5509e8cb290ddf3c3b8b" Apr 17 11:26:51.508633 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.508604 2581 status_manager.go:895] "Failed to get status for pod" podUID="f9ed7d5a-53fa-425b-9879-ef04966bb6b1" pod="openshift-must-gather-klv7k/must-gather-glnrg" err="pods \"must-gather-glnrg\" is forbidden: User \"system:node:ip-10-0-136-177.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-klv7k\": no relationship found between node 'ip-10-0-136-177.ec2.internal' and this object" Apr 17 11:26:51.517364 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.517344 2581 scope.go:117] "RemoveContainer" containerID="3fab8e109ba20c3a71c245e2d1413840c7eda19a273b6ae469ec3ee2a499ddfa" Apr 17 11:26:51.517623 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:26:51.517606 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fab8e109ba20c3a71c245e2d1413840c7eda19a273b6ae469ec3ee2a499ddfa\": container with ID 
starting with 3fab8e109ba20c3a71c245e2d1413840c7eda19a273b6ae469ec3ee2a499ddfa not found: ID does not exist" containerID="3fab8e109ba20c3a71c245e2d1413840c7eda19a273b6ae469ec3ee2a499ddfa" Apr 17 11:26:51.517671 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.517632 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fab8e109ba20c3a71c245e2d1413840c7eda19a273b6ae469ec3ee2a499ddfa"} err="failed to get container status \"3fab8e109ba20c3a71c245e2d1413840c7eda19a273b6ae469ec3ee2a499ddfa\": rpc error: code = NotFound desc = could not find container \"3fab8e109ba20c3a71c245e2d1413840c7eda19a273b6ae469ec3ee2a499ddfa\": container with ID starting with 3fab8e109ba20c3a71c245e2d1413840c7eda19a273b6ae469ec3ee2a499ddfa not found: ID does not exist" Apr 17 11:26:51.517671 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.517651 2581 scope.go:117] "RemoveContainer" containerID="bae88cdb9ee40c73fe4cc7bc604d342503d1af05d9af5509e8cb290ddf3c3b8b" Apr 17 11:26:51.517918 ip-10-0-136-177 kubenswrapper[2581]: E0417 11:26:51.517890 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bae88cdb9ee40c73fe4cc7bc604d342503d1af05d9af5509e8cb290ddf3c3b8b\": container with ID starting with bae88cdb9ee40c73fe4cc7bc604d342503d1af05d9af5509e8cb290ddf3c3b8b not found: ID does not exist" containerID="bae88cdb9ee40c73fe4cc7bc604d342503d1af05d9af5509e8cb290ddf3c3b8b" Apr 17 11:26:51.517918 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.517909 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bae88cdb9ee40c73fe4cc7bc604d342503d1af05d9af5509e8cb290ddf3c3b8b"} err="failed to get container status \"bae88cdb9ee40c73fe4cc7bc604d342503d1af05d9af5509e8cb290ddf3c3b8b\": rpc error: code = NotFound desc = could not find container \"bae88cdb9ee40c73fe4cc7bc604d342503d1af05d9af5509e8cb290ddf3c3b8b\": container with ID starting with 
bae88cdb9ee40c73fe4cc7bc604d342503d1af05d9af5509e8cb290ddf3c3b8b not found: ID does not exist" Apr 17 11:26:51.661125 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.661089 2581 status_manager.go:895] "Failed to get status for pod" podUID="f9ed7d5a-53fa-425b-9879-ef04966bb6b1" pod="openshift-must-gather-klv7k/must-gather-glnrg" err="pods \"must-gather-glnrg\" is forbidden: User \"system:node:ip-10-0-136-177.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-klv7k\": no relationship found between node 'ip-10-0-136-177.ec2.internal' and this object" Apr 17 11:26:51.661728 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:51.661709 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ed7d5a-53fa-425b-9879-ef04966bb6b1" path="/var/lib/kubelet/pods/f9ed7d5a-53fa-425b-9879-ef04966bb6b1/volumes" Apr 17 11:26:52.553567 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:52.553536 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-bff94c98c-8ffzl_73a3d378-3878-418a-9279-539c643fab5a/metrics-server/0.log" Apr 17 11:26:52.581812 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:52.581781 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-776fv_4b416cfd-b1d8-4cfb-a5f3-9d2cf17e642b/monitoring-plugin/0.log" Apr 17 11:26:52.693810 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:52.693780 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lwtkb_33907f69-05ed-46d3-a4c7-b24d3165e1e7/node-exporter/0.log" Apr 17 11:26:52.718112 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:52.718088 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lwtkb_33907f69-05ed-46d3-a4c7-b24d3165e1e7/kube-rbac-proxy/0.log" Apr 17 11:26:52.741199 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:52.741174 2581 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lwtkb_33907f69-05ed-46d3-a4c7-b24d3165e1e7/init-textfile/0.log" Apr 17 11:26:53.186406 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:53.186376 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-d6d28_aa810299-6a4d-4ef5-b2b3-abe18f37385c/prometheus-operator/0.log" Apr 17 11:26:53.260855 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:53.260806 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-d6d28_aa810299-6a4d-4ef5-b2b3-abe18f37385c/kube-rbac-proxy/0.log" Apr 17 11:26:53.431113 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:53.431080 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dccb875d5-dgb9v_4121857a-9f7b-48f9-83c8-b78509c1d47f/thanos-query/0.log" Apr 17 11:26:53.458011 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:53.457938 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dccb875d5-dgb9v_4121857a-9f7b-48f9-83c8-b78509c1d47f/kube-rbac-proxy-web/0.log" Apr 17 11:26:53.483086 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:53.483061 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dccb875d5-dgb9v_4121857a-9f7b-48f9-83c8-b78509c1d47f/kube-rbac-proxy/0.log" Apr 17 11:26:53.509508 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:53.509477 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dccb875d5-dgb9v_4121857a-9f7b-48f9-83c8-b78509c1d47f/prom-label-proxy/0.log" Apr 17 11:26:53.534543 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:53.534504 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dccb875d5-dgb9v_4121857a-9f7b-48f9-83c8-b78509c1d47f/kube-rbac-proxy-rules/0.log" Apr 17 11:26:53.561253 
ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:53.561222 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dccb875d5-dgb9v_4121857a-9f7b-48f9-83c8-b78509c1d47f/kube-rbac-proxy-metrics/0.log" Apr 17 11:26:54.825194 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:54.825159 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-zdr26_76699d2b-150e-4629-9c16-03548712a64f/networking-console-plugin/0.log" Apr 17 11:26:56.097139 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.097104 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp"] Apr 17 11:26:56.097512 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.097408 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9ed7d5a-53fa-425b-9879-ef04966bb6b1" containerName="gather" Apr 17 11:26:56.097512 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.097419 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ed7d5a-53fa-425b-9879-ef04966bb6b1" containerName="gather" Apr 17 11:26:56.097512 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.097438 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9ed7d5a-53fa-425b-9879-ef04966bb6b1" containerName="copy" Apr 17 11:26:56.097512 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.097443 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ed7d5a-53fa-425b-9879-ef04966bb6b1" containerName="copy" Apr 17 11:26:56.097512 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.097493 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9ed7d5a-53fa-425b-9879-ef04966bb6b1" containerName="copy" Apr 17 11:26:56.097512 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.097503 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9ed7d5a-53fa-425b-9879-ef04966bb6b1" 
containerName="gather" Apr 17 11:26:56.100338 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.100321 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.103393 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.103373 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vp6ck\"/\"openshift-service-ca.crt\"" Apr 17 11:26:56.103503 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.103432 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vp6ck\"/\"kube-root-ca.crt\"" Apr 17 11:26:56.103503 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.103488 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vp6ck\"/\"default-dockercfg-hjq8z\"" Apr 17 11:26:56.119303 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.119275 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp"] Apr 17 11:26:56.126170 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.126144 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmbw9\" (UniqueName: \"kubernetes.io/projected/c89c3222-3261-4637-a534-fe9cc8804e0b-kube-api-access-mmbw9\") pod \"perf-node-gather-daemonset-vctpp\" (UID: \"c89c3222-3261-4637-a534-fe9cc8804e0b\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.126281 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.126194 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c89c3222-3261-4637-a534-fe9cc8804e0b-podres\") pod \"perf-node-gather-daemonset-vctpp\" (UID: \"c89c3222-3261-4637-a534-fe9cc8804e0b\") " 
pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.126281 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.126265 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c89c3222-3261-4637-a534-fe9cc8804e0b-proc\") pod \"perf-node-gather-daemonset-vctpp\" (UID: \"c89c3222-3261-4637-a534-fe9cc8804e0b\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.126363 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.126318 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c89c3222-3261-4637-a534-fe9cc8804e0b-lib-modules\") pod \"perf-node-gather-daemonset-vctpp\" (UID: \"c89c3222-3261-4637-a534-fe9cc8804e0b\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.126363 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.126347 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c89c3222-3261-4637-a534-fe9cc8804e0b-sys\") pod \"perf-node-gather-daemonset-vctpp\" (UID: \"c89c3222-3261-4637-a534-fe9cc8804e0b\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.227380 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.227341 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c89c3222-3261-4637-a534-fe9cc8804e0b-proc\") pod \"perf-node-gather-daemonset-vctpp\" (UID: \"c89c3222-3261-4637-a534-fe9cc8804e0b\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.227380 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.227384 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/c89c3222-3261-4637-a534-fe9cc8804e0b-lib-modules\") pod \"perf-node-gather-daemonset-vctpp\" (UID: \"c89c3222-3261-4637-a534-fe9cc8804e0b\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.227637 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.227402 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c89c3222-3261-4637-a534-fe9cc8804e0b-sys\") pod \"perf-node-gather-daemonset-vctpp\" (UID: \"c89c3222-3261-4637-a534-fe9cc8804e0b\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.227637 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.227431 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmbw9\" (UniqueName: \"kubernetes.io/projected/c89c3222-3261-4637-a534-fe9cc8804e0b-kube-api-access-mmbw9\") pod \"perf-node-gather-daemonset-vctpp\" (UID: \"c89c3222-3261-4637-a534-fe9cc8804e0b\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.227637 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.227471 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c89c3222-3261-4637-a534-fe9cc8804e0b-podres\") pod \"perf-node-gather-daemonset-vctpp\" (UID: \"c89c3222-3261-4637-a534-fe9cc8804e0b\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.227637 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.227466 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c89c3222-3261-4637-a534-fe9cc8804e0b-proc\") pod \"perf-node-gather-daemonset-vctpp\" (UID: \"c89c3222-3261-4637-a534-fe9cc8804e0b\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.227637 ip-10-0-136-177 kubenswrapper[2581]: I0417 
11:26:56.227503 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c89c3222-3261-4637-a534-fe9cc8804e0b-sys\") pod \"perf-node-gather-daemonset-vctpp\" (UID: \"c89c3222-3261-4637-a534-fe9cc8804e0b\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.227637 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.227528 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c89c3222-3261-4637-a534-fe9cc8804e0b-lib-modules\") pod \"perf-node-gather-daemonset-vctpp\" (UID: \"c89c3222-3261-4637-a534-fe9cc8804e0b\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.227637 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.227599 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c89c3222-3261-4637-a534-fe9cc8804e0b-podres\") pod \"perf-node-gather-daemonset-vctpp\" (UID: \"c89c3222-3261-4637-a534-fe9cc8804e0b\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.237212 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.237180 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmbw9\" (UniqueName: \"kubernetes.io/projected/c89c3222-3261-4637-a534-fe9cc8804e0b-kube-api-access-mmbw9\") pod \"perf-node-gather-daemonset-vctpp\" (UID: \"c89c3222-3261-4637-a534-fe9cc8804e0b\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.410324 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.410275 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:56.537758 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:56.537729 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp"] Apr 17 11:26:56.540240 ip-10-0-136-177 kubenswrapper[2581]: W0417 11:26:56.540197 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc89c3222_3261_4637_a534_fe9cc8804e0b.slice/crio-a3164c1b057e10115280f4b0342da351c76ea164344c5830cedcf655755d5072 WatchSource:0}: Error finding container a3164c1b057e10115280f4b0342da351c76ea164344c5830cedcf655755d5072: Status 404 returned error can't find the container with id a3164c1b057e10115280f4b0342da351c76ea164344c5830cedcf655755d5072 Apr 17 11:26:57.229924 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:57.229892 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rpkz7_2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3/dns/0.log" Apr 17 11:26:57.279052 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:57.279013 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rpkz7_2ae9be6b-7bb8-4ea9-b282-621f3e3bb4a3/kube-rbac-proxy/0.log" Apr 17 11:26:57.395035 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:57.395004 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8mkn6_5ee7c1ba-cbb5-4b2b-b2f7-23d447b10b96/dns-node-resolver/0.log" Apr 17 11:26:57.517935 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:57.517845 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" event={"ID":"c89c3222-3261-4637-a534-fe9cc8804e0b","Type":"ContainerStarted","Data":"8aa023c2cb75004dd3e08bb63c9a7092279e494d72da28f87d9c3e2d5bb4a057"} Apr 17 11:26:57.517935 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:57.517888 2581 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" event={"ID":"c89c3222-3261-4637-a534-fe9cc8804e0b","Type":"ContainerStarted","Data":"a3164c1b057e10115280f4b0342da351c76ea164344c5830cedcf655755d5072"} Apr 17 11:26:57.518133 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:57.517988 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:26:57.540740 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:57.540686 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" podStartSLOduration=1.540670586 podStartE2EDuration="1.540670586s" podCreationTimestamp="2026-04-17 11:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:26:57.5391084 +0000 UTC m=+626.481262638" watchObservedRunningTime="2026-04-17 11:26:57.540670586 +0000 UTC m=+626.482824826" Apr 17 11:26:58.014019 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:58.013976 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sdhv4_3b8f2274-31f6-413e-944a-132f5e8db8f6/node-ca/0.log" Apr 17 11:26:58.897695 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:58.897668 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5f8c5cdcb8-ts845_d77ac6f7-5bf1-4c3d-88ac-46e8ef9f8707/router/0.log" Apr 17 11:26:59.312987 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:59.312913 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7wbxp_8fe4a1b8-871d-4f96-95f0-946d542180da/serve-healthcheck-canary/0.log" Apr 17 11:26:59.998453 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:26:59.998423 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-wpkcf_5870bfff-e1f4-4430-8513-d97e5bb1dcaa/kube-rbac-proxy/0.log" Apr 17 11:27:00.024338 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:00.024307 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wpkcf_5870bfff-e1f4-4430-8513-d97e5bb1dcaa/exporter/0.log" Apr 17 11:27:00.049846 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:00.049804 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wpkcf_5870bfff-e1f4-4430-8513-d97e5bb1dcaa/extractor/0.log" Apr 17 11:27:02.084418 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:02.084392 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-dpx74_f270178e-46a1-4abc-a804-f86a39b8dd51/manager/0.log" Apr 17 11:27:02.145948 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:02.145919 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-9hwhx_752a2175-fdc0-4bc2-9946-3f222fd602f6/manager/0.log" Apr 17 11:27:02.168042 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:02.168014 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-wwz87_8ca26738-9d31-4c23-9efc-3ef67a7d304c/s3-init/0.log" Apr 17 11:27:02.200361 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:02.200325 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-6zs2d_41b5af85-5a6d-4a53-a59d-2fddbd2b8d95/seaweedfs/0.log" Apr 17 11:27:03.531740 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:03.531712 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-vctpp" Apr 17 11:27:06.339627 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:06.339550 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-m5hsm_4a477a56-2449-459a-8d09-6f1648b29153/migrator/0.log" Apr 17 11:27:06.363742 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:06.363711 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-m5hsm_4a477a56-2449-459a-8d09-6f1648b29153/graceful-termination/0.log" Apr 17 11:27:08.084014 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:08.083986 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b92zr_a3311f8e-8452-4224-8b40-1d0392b66a65/kube-multus-additional-cni-plugins/0.log" Apr 17 11:27:08.112460 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:08.112430 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b92zr_a3311f8e-8452-4224-8b40-1d0392b66a65/egress-router-binary-copy/0.log" Apr 17 11:27:08.137790 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:08.137760 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b92zr_a3311f8e-8452-4224-8b40-1d0392b66a65/cni-plugins/0.log" Apr 17 11:27:08.162772 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:08.162705 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b92zr_a3311f8e-8452-4224-8b40-1d0392b66a65/bond-cni-plugin/0.log" Apr 17 11:27:08.188403 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:08.188379 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b92zr_a3311f8e-8452-4224-8b40-1d0392b66a65/routeoverride-cni/0.log" Apr 17 11:27:08.213150 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:08.213119 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b92zr_a3311f8e-8452-4224-8b40-1d0392b66a65/whereabouts-cni-bincopy/0.log" Apr 17 11:27:08.242723 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:08.242691 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b92zr_a3311f8e-8452-4224-8b40-1d0392b66a65/whereabouts-cni/0.log" Apr 17 11:27:08.485016 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:08.484945 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-szct7_2d3b1e8c-4d88-46ef-95e8-c7034cf6ec2b/kube-multus/0.log" Apr 17 11:27:08.544788 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:08.544764 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9k5nh_bee0bc88-7732-4010-9886-3df7384bf1c8/network-metrics-daemon/0.log" Apr 17 11:27:08.567979 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:08.567957 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9k5nh_bee0bc88-7732-4010-9886-3df7384bf1c8/kube-rbac-proxy/0.log" Apr 17 11:27:10.217483 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:10.217453 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4q4t_4ab6c44c-fe99-4f3a-a19e-93959e1d3d56/ovn-controller/0.log" Apr 17 11:27:10.245505 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:10.245476 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4q4t_4ab6c44c-fe99-4f3a-a19e-93959e1d3d56/ovn-acl-logging/0.log" Apr 17 11:27:10.268400 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:10.268376 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4q4t_4ab6c44c-fe99-4f3a-a19e-93959e1d3d56/kube-rbac-proxy-node/0.log" Apr 17 11:27:10.293984 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:10.293956 2581 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4q4t_4ab6c44c-fe99-4f3a-a19e-93959e1d3d56/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 11:27:10.318809 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:10.318783 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4q4t_4ab6c44c-fe99-4f3a-a19e-93959e1d3d56/northd/0.log" Apr 17 11:27:10.420315 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:10.420283 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4q4t_4ab6c44c-fe99-4f3a-a19e-93959e1d3d56/nbdb/0.log" Apr 17 11:27:10.447837 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:10.447797 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4q4t_4ab6c44c-fe99-4f3a-a19e-93959e1d3d56/sbdb/0.log" Apr 17 11:27:10.556188 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:10.556113 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4q4t_4ab6c44c-fe99-4f3a-a19e-93959e1d3d56/ovnkube-controller/0.log" Apr 17 11:27:11.761359 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:11.761326 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-rdxhf_1679a0d1-a3f2-40a1-aa7a-e0d8183e0f7f/network-check-target-container/0.log" Apr 17 11:27:12.692631 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:12.692600 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-fvtct_505d1e1b-000d-4203-b021-c56c5c5d8c56/iptables-alerter/0.log" Apr 17 11:27:13.468290 ip-10-0-136-177 kubenswrapper[2581]: I0417 11:27:13.468254 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-ldnwz_38f8cb42-d739-4806-98ed-206508f9cc9c/tuned/0.log"