Apr 16 19:18:12.386504 ip-10-0-130-83 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:18:12.847306 ip-10-0-130-83 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:18:12.847306 ip-10-0-130-83 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:18:12.847306 ip-10-0-130-83 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:18:12.847306 ip-10-0-130-83 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:18:12.847306 ip-10-0-130-83 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
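The deprecation warnings above say these flags belong in the file passed via --config. A minimal sketch of what that migration could look like, assuming a KubeletConfiguration at /etc/kubernetes/kubelet.conf (the field names systemReserved, volumePluginDir, and containerRuntimeEndpoint are standard KubeletConfiguration fields; the example values here are illustrative, not taken from this node):

```yaml
# Hypothetical /etc/kubernetes/kubelet.conf fragment replacing the
# deprecated command-line flags flagged in the log above.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (values are placeholders)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
```

Note that --minimum-container-ttl-duration has no config-file equivalent; per the warning, eviction thresholds (evictionHard / evictionSoft in the config file) are the replacement.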
Apr 16 19:18:12.849741 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.849653    2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:18:12.852072 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852052    2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:18:12.852072 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852072    2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852076    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852080    2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852084    2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852087    2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852089    2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852092    2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852095    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852097    2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852100    2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852104    2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852108    2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852112    2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852121    2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852124    2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852127    2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852130    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852133    2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852135    2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852138    2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:18:12.852142 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852141    2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852143    2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852146    2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852149    2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852152    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852155    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852158    2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852160    2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852163    2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852165    2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852168    2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852170    2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852173    2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852175    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852178    2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852180    2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852183    2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852200    2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852202    2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852205    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:18:12.852683 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852208    2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852211    2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852213    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852216    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852219    2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852222    2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852224    2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852227    2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852229    2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852232    2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852234    2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852237    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852239    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852242    2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852246    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852249    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852252    2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852256    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852258    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852261    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:18:12.853182 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852269    2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852273    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852275    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852278    2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852286    2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852290    2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852293    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852295    2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852298    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852301    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852303    2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852312    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852315    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852317    2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852320    2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852323    2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852325    2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852328    2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852331    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:18:12.853685 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852333    2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852336    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852338    2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852341    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852343    2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852346    2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852797    2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852803    2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852807    2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852810    2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852812    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852815    2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852818    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852821    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852823    2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852826    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852835    2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852837    2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852840    2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:18:12.854262 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852843    2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852845    2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852848    2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852850    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852853    2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852862    2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852865    2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852867    2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852870    2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852872    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852875    2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852877    2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852880    2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852883    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852885    2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852888    2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852892    2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852894    2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852897    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:18:12.854722 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852899    2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852902    2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852905    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852908    2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852910    2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852913    2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852915    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852918    2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852920    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852923    2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852926    2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852928    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852931    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852933    2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852936    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852939    2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852941    2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852944    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852947    2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:18:12.855245 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852955    2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852957    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852960    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852962    2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852965    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852967    2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852970    2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852973    2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852975    2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852978    2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852980    2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852983    2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852986    2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852988    2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852991    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852994    2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852996    2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.852998    2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853001    2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853004    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:18:12.855718 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853006    2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853008    2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853011    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853013    2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853018    2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853022    2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853025    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853028    2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853030    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853033    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853036    2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853038    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853041    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853050    2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.853053    2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853667    2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853682    2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853691    2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853696    2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853701    2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853704    2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:18:12.856224 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853709    2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853718    2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853721    2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853724    2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853728    2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853731    2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853734    2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853737    2580 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853740    2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853743    2580 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853747    2580 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853750    2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853753    2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853761    2580 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853764    2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853767    2580 flags.go:64] FLAG: --config-dir=""
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853770    2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853774    2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853778    2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853781    2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853784    2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853789    2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853792    2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853795    2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:18:12.856754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853798    2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853807    2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853810    2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853814    2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853817    2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853821    2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853824    2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853827    2580 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853830    2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853839    2580 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853842    2580 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853845    2580 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853848    2580 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853851    2580 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853855    2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853858    2580 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853861    2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853864    2580 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853867    2580 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853870    2580 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853873    2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853876    2580 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853879    2580 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853882    2580 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853884    2580 flags.go:64] FLAG: --feature-gates=""
Apr 16 19:18:12.857351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853888    2580 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853891    2580 flags.go:64] FLAG:
--global-housekeeping-interval="1m0s" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853894 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853898 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853902 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853905 2580 flags.go:64] FLAG: --help="false" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853907 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-130-83.ec2.internal" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853911 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853914 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853925 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853928 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853932 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853935 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853938 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853940 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 19:18:12.857951 ip-10-0-130-83 
kubenswrapper[2580]: I0416 19:18:12.853943 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853948 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853951 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853954 2580 flags.go:64] FLAG: --kube-reserved="" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853957 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853960 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853963 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853966 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853969 2580 flags.go:64] FLAG: --lock-file="" Apr 16 19:18:12.857951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853972 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853975 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853977 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853983 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853993 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.853997 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 
19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854000 2580 flags.go:64] FLAG: --logging-format="text" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854003 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854007 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854010 2580 flags.go:64] FLAG: --manifest-url="" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854013 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854017 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854022 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854027 2580 flags.go:64] FLAG: --max-pods="110" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854030 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854033 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854036 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854039 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854048 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854052 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854055 2580 
flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854064 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854067 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854070 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 19:18:12.859136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854074 2580 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854077 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854084 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854087 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854090 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854093 2580 flags.go:64] FLAG: --port="10250" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854096 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854099 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ded8574e58137a7b" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854102 2580 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854105 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 
19:18:12.854108 2580 flags.go:64] FLAG: --register-node="true" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854111 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854114 2580 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854118 2580 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854121 2580 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854124 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854127 2580 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854130 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854134 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854137 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854141 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854144 2580 flags.go:64] FLAG: --runonce="false" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854147 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854150 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854154 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:18:12.859747 ip-10-0-130-83 kubenswrapper[2580]: I0416 
19:18:12.854157 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854160 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854169 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854172 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854176 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854178 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854182 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854184 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854200 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854204 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854207 2580 flags.go:64] FLAG: --system-cgroups="" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854210 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854215 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854218 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854221 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" 
Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854230 2580 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854233 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854236 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854239 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854242 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854245 2580 flags.go:64] FLAG: --v="2" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854249 2580 flags.go:64] FLAG: --version="false" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854254 2580 flags.go:64] FLAG: --vmodule="" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854258 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854261 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 19:18:12.860371 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854391 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854395 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854400 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854403 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 
19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854406 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854409 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854413 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854417 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854420 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854423 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854431 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854434 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854437 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854440 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854443 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854446 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 
19:18:12.854449 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854451 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854454 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:18:12.860976 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854456 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854459 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854461 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854464 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854467 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854469 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854472 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854474 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854477 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854479 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:18:12.861492 ip-10-0-130-83 
kubenswrapper[2580]: W0416 19:18:12.854482 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854484 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854487 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854489 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854492 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854496 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854499 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854501 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854504 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854506 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:18:12.861492 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854509 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854511 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854514 2580 
feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854516 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854525 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854528 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854530 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854533 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854536 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854539 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854541 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854544 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854547 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854549 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854552 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:18:12.862020 
ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854554 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854557 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854559 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854562 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854564 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:18:12.862020 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854567 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854569 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854572 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854574 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854577 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854579 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854582 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854586 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 
19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854588 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854591 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854593 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854596 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854598 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854601 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854603 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854606 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854608 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854617 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854620 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:18:12.862533 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854624 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 19:18:12.862998 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854627 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:18:12.862998 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854630 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:18:12.862998 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854633 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:18:12.862998 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854635 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:18:12.862998 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854638 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:18:12.862998 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854641 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:18:12.862998 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.854644 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:18:12.862998 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.854649 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 19:18:12.862998 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.861197 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 19:18:12.862998 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.861215 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 19:18:12.862998 
ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861268 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:18:12.862998 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861273 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:18:12.862998 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861276 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:18:12.862998 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861279 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:18:12.862998 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861283 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861286 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861289 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861292 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861294 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861298 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861301 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861303 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861306 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861308 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861311 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861314 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861316 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861319 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861322 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861324 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861327 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861330 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861333 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861335 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:18:12.863396 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861338 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861341 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861344 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861346 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861349 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861351 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861354 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861364 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861368 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861370 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861373 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861375 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861378 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861382 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861387 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861390 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861393 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861396 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861399 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861401 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:18:12.863868 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861405 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861407 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861410 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861413 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861416 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861418 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861421 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861424 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861426 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861429 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861431 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861434 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861438 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861441 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861443 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861447 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861449 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861452 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861454 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861458 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:18:12.864374 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861462 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861464 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861467 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861469 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861472 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861474 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861477 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861480 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861482 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861484 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861487 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861490 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861492 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861495 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861498 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861500 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861503 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861506 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861509 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861512 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:18:12.864895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861514 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:18:12.865383 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861517 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:18:12.865383 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.861522 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:18:12.865383 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861629 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:18:12.865383 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861634 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:18:12.865383 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861638 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:18:12.865383 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861641 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:18:12.865383 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861643 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:18:12.865383 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861647 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:18:12.865383 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861650 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:18:12.865383 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861655 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:18:12.865383 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861658 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:18:12.865383 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861661 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:18:12.865383 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861664 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:18:12.865383 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861666 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:18:12.865383 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861669 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861671 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861674 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861676 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861679 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861681 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861684 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861686 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861689 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861691 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861694 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861696 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861699 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861701 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861704 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861706 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861709 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861711 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861714 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861716 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:18:12.865761 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861720 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861724 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861727 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861730 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861732 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861735 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861737 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861740 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861742 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861745 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861748 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861751 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861753 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861755 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861758 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861761 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861763 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861766 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861768 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861771 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:18:12.866266 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861773 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861776 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861779 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861781 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861783 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861786 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861788 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861791 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861793 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861796 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861798 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861801 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861804 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861806 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861809 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861811 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861814 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861816 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861819 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861822 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:18:12.866769 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861824 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:18:12.867277 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861827 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:18:12.867277 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861829 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:18:12.867277 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861832 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:18:12.867277 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861835 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:18:12.867277 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861837 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:18:12.867277 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861840 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:18:12.867277 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861842 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:18:12.867277 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861845 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:18:12.867277 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861848 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:18:12.867277 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861850 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:18:12.867277 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861852 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:18:12.867277 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861855 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:18:12.867277 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:12.861858 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:18:12.867277 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.861863 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:18:12.867277 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.862643 2580 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 19:18:12.867653 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.864802 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 19:18:12.867653 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.865789 2580 server.go:1019] "Starting client certificate rotation"
Apr 16 19:18:12.867653 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.865886 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:18:12.867653 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.865928 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:18:12.891593 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.891569 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:18:12.895205 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.895165 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:18:12.909141 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.909117 2580 log.go:25] "Validated CRI v1 runtime API"
Apr 16 19:18:12.915958 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.915938 2580 log.go:25] "Validated CRI v1 image API"
Apr 16 19:18:12.917224 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.917209 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 19:18:12.921789 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.921756 2580 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 af698161-a978-47eb-91fe-571f0637b685:/dev/nvme0n1p3 db9b6636-f8df-4ebb-89f2-ba833187b19e:/dev/nvme0n1p4]
Apr 16 19:18:12.921866 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.921786 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 19:18:12.924199 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.924173 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:18:12.927984 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.927865 2580 manager.go:217] Machine: {Timestamp:2026-04-16 19:18:12.925780745 +0000 UTC m=+0.412978569 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100220 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23be0cfac81c2c8b4156960969c8a0 SystemUUID:ec23be0c-fac8-1c2c-8b41-56960969c8a0 BootID:30c11a5b-afb9-4fb2-9ed2-5405bbccf7fe Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f3:8c:90:eb:49 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f3:8c:90:eb:49 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:f0:e7:8b:5f:2b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 19:18:12.927984 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.927979 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 19:18:12.928120 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.928108 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 19:18:12.928461 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.928435 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 19:18:12.928633 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.928463 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-83.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 19:18:12.928678 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.928643 2580 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 19:18:12.928678 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.928651 2580 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 19:18:12.928678 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.928665 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:18:12.928763 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.928678 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:18:12.929557 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.929546 2580 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 19:18:12.929660 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.929652 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 19:18:12.932149 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.932139 2580 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 19:18:12.932201 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.932153 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 19:18:12.932201 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.932165 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 19:18:12.932201 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.932176 2580 kubelet.go:397] "Adding apiserver pod source"
Apr 16 19:18:12.932201 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.932184 2580 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 16 19:18:12.933217 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.933202 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:18:12.933302 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.933223 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:18:12.936590 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.936573 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 19:18:12.938103 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.938087 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 19:18:12.940522 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.940506 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 19:18:12.940601 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.940528 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 19:18:12.940601 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.940536 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 19:18:12.940601 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.940545 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 19:18:12.940601 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.940553 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 19:18:12.940601 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.940562 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 19:18:12.940601 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.940571 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 
19:18:12.940601 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.940580 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 19:18:12.940601 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.940590 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 19:18:12.940601 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.940600 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 19:18:12.940884 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.940613 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 19:18:12.940884 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.940628 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 19:18:12.944490 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.944471 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 19:18:12.944618 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.944606 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 19:18:12.948179 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.948139 2580 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-83.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 19:18:12.948276 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:12.948177 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 19:18:12.948276 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:12.948246 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes 
\"ip-10-0-130-83.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 19:18:12.949012 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.948998 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 19:18:12.949051 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.949047 2580 server.go:1295] "Started kubelet" Apr 16 19:18:12.949159 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.949116 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 19:18:12.949239 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.949167 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 19:18:12.949299 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.949264 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 19:18:12.949974 ip-10-0-130-83 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 19:18:12.951142 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.951112 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 19:18:12.951720 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.951701 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6dbgq"
Apr 16 19:18:12.952394 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.952380 2580 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 19:18:12.956845 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.956825 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 19:18:12.957534 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.957518 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 19:18:12.957642 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:12.956520 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-83.ec2.internal.18a6ec759deb503c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-83.ec2.internal,UID:ip-10-0-130-83.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-83.ec2.internal,},FirstTimestamp:2026-04-16 19:18:12.94901254 +0000 UTC m=+0.436210363,LastTimestamp:2026-04-16 19:18:12.94901254 +0000 UTC m=+0.436210363,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-83.ec2.internal,}"
Apr 16 19:18:12.958322 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.958302 2580 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 19:18:12.958322 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.958325 2580 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 19:18:12.958469 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.958352 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 19:18:12.958469 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.958398 2580 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 19:18:12.958469 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.958406 2580 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 19:18:12.958602 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.958514 2580 factory.go:55] Registering systemd factory
Apr 16 19:18:12.958602 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.958535 2580 factory.go:223] Registration of the systemd container factory successfully
Apr 16 19:18:12.958848 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.958830 2580 factory.go:153] Registering CRI-O factory
Apr 16 19:18:12.958848 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.958846 2580 factory.go:223] Registration of the crio container factory successfully
Apr 16 19:18:12.958959 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.958897 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 19:18:12.958959 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.958922 2580 factory.go:103] Registering Raw factory
Apr 16 19:18:12.958959 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.958936 2580 manager.go:1196] Started watching for new ooms in manager
Apr 16 19:18:12.958959 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:12.958944 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:12.959411 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.959390 2580 manager.go:319] Starting recovery of all containers
Apr 16 19:18:12.960717 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:12.960699 2580 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 19:18:12.962777 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.962750 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6dbgq"
Apr 16 19:18:12.966020 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:12.965979 2580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-83.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 19:18:12.966293 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:12.966248 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 19:18:12.969146 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.969115 2580 manager.go:324] Recovery completed
Apr 16 19:18:12.975177 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.975165 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:18:12.977881 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.977864 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:18:12.977978 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.977891 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:18:12.977978 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.977901 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:18:12.978464 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.978451 2580 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 19:18:12.978464 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.978461 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 19:18:12.978571 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.978477 2580 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 19:18:12.980971 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.980960 2580 policy_none.go:49] "None policy: Start"
Apr 16 19:18:12.981021 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.980978 2580 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 19:18:12.981021 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:12.980988 2580 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 19:18:13.025179 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.017800 2580 manager.go:341] "Starting Device Plugin manager"
Apr 16 19:18:13.025179 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.017896 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 19:18:13.025179 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.017909 2580 server.go:85] "Starting device plugin registration server"
Apr 16 19:18:13.025179 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.018212 2580 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 19:18:13.025179 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.018225 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 19:18:13.025179 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.018323 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 19:18:13.025179 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.018420 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 19:18:13.025179 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.018432 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 19:18:13.025179 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.019216 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 19:18:13.025179 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.019250 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:13.080298 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.080264 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 19:18:13.081430 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.081404 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 19:18:13.081430 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.081435 2580 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 19:18:13.081592 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.081452 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 19:18:13.081592 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.081459 2580 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 19:18:13.081592 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.081537 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 19:18:13.083487 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.083467 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:18:13.119283 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.119183 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:18:13.120710 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.120692 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:18:13.120813 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.120724 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:18:13.120813 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.120734 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:18:13.120813 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.120759 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.128812 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.128792 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.128881 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.128820 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-83.ec2.internal\": node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:13.146300 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.146273 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:13.181668 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.181636 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-83.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal"]
Apr 16 19:18:13.181807 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.181706 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:18:13.183425 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.183406 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:18:13.183514 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.183438 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:18:13.183514 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.183447 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:18:13.184926 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.184915 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:18:13.185084 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.185069 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.185131 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.185098 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:18:13.185707 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.185684 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:18:13.185707 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.185702 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:18:13.185845 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.185714 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:18:13.185845 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.185721 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:18:13.185845 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.185730 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:18:13.185845 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.185731 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:18:13.186969 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.186952 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.187040 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.186982 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:18:13.187626 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.187613 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:18:13.187700 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.187638 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:18:13.187700 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.187648 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:18:13.215822 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.215799 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-83.ec2.internal\" not found" node="ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.220417 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.220399 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-83.ec2.internal\" not found" node="ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.246461 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.246434 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:13.260968 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.260943 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/189f219007fb4c6d42f85db8190298b0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal\" (UID: \"189f219007fb4c6d42f85db8190298b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.261071 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.260969 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/058aac4aaec6cd6ac195597fdcd3d1b7-config\") pod \"kube-apiserver-proxy-ip-10-0-130-83.ec2.internal\" (UID: \"058aac4aaec6cd6ac195597fdcd3d1b7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.261071 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.260991 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/189f219007fb4c6d42f85db8190298b0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal\" (UID: \"189f219007fb4c6d42f85db8190298b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.347528 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.347482 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:13.361870 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.361838 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/058aac4aaec6cd6ac195597fdcd3d1b7-config\") pod \"kube-apiserver-proxy-ip-10-0-130-83.ec2.internal\" (UID: \"058aac4aaec6cd6ac195597fdcd3d1b7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.361983 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.361912 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/189f219007fb4c6d42f85db8190298b0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal\" (UID: \"189f219007fb4c6d42f85db8190298b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.361983 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.361848 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/058aac4aaec6cd6ac195597fdcd3d1b7-config\") pod \"kube-apiserver-proxy-ip-10-0-130-83.ec2.internal\" (UID: \"058aac4aaec6cd6ac195597fdcd3d1b7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.361983 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.361938 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/189f219007fb4c6d42f85db8190298b0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal\" (UID: \"189f219007fb4c6d42f85db8190298b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.362101 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.361984 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/189f219007fb4c6d42f85db8190298b0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal\" (UID: \"189f219007fb4c6d42f85db8190298b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.362101 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.362004 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/189f219007fb4c6d42f85db8190298b0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal\" (UID: \"189f219007fb4c6d42f85db8190298b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.448288 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.448260 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:13.517771 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.517742 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.523319 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.523301 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal"
Apr 16 19:18:13.549221 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.549176 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:13.649745 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.649700 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:13.750269 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.750164 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:13.850852 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.850810 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:13.865233 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.865211 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 19:18:13.865366 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.865351 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 19:18:13.951501 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:13.951470 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:13.957729 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.957708 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 19:18:13.965475 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.965443 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:13:12 +0000 UTC" deadline="2027-10-15 07:32:09.219701261 +0000 UTC"
Apr 16 19:18:13.965475 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.965474 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13116h13m55.254231465s"
Apr 16 19:18:13.968965 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.968946 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:18:13.969373 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.969358 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:18:13.983543 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:13.983388 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod058aac4aaec6cd6ac195597fdcd3d1b7.slice/crio-7b1c3f08b1bfee6a923e5b8fee60d634f2d8a57183f226021f41979605969b6d WatchSource:0}: Error finding container 7b1c3f08b1bfee6a923e5b8fee60d634f2d8a57183f226021f41979605969b6d: Status 404 returned error can't find the container with id 7b1c3f08b1bfee6a923e5b8fee60d634f2d8a57183f226021f41979605969b6d
Apr 16 19:18:13.983755 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:13.983732 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod189f219007fb4c6d42f85db8190298b0.slice/crio-91fdb81a150e82d12da5409165a25793db72d75f8694dd3d1c2f9ba23566c739 WatchSource:0}: Error finding container 91fdb81a150e82d12da5409165a25793db72d75f8694dd3d1c2f9ba23566c739: Status 404 returned error can't find the container with id 91fdb81a150e82d12da5409165a25793db72d75f8694dd3d1c2f9ba23566c739
Apr 16 19:18:13.988249 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.988235 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:18:13.993130 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:13.993113 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-pq6m9"
Apr 16 19:18:14.000867 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.000814 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-pq6m9"
Apr 16 19:18:14.052097 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:14.052059 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:14.085875 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.085812 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal" event={"ID":"189f219007fb4c6d42f85db8190298b0","Type":"ContainerStarted","Data":"91fdb81a150e82d12da5409165a25793db72d75f8694dd3d1c2f9ba23566c739"}
Apr 16 19:18:14.086778 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.086753 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-83.ec2.internal" event={"ID":"058aac4aaec6cd6ac195597fdcd3d1b7","Type":"ContainerStarted","Data":"7b1c3f08b1bfee6a923e5b8fee60d634f2d8a57183f226021f41979605969b6d"}
Apr 16 19:18:14.152980 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:14.152948 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:14.253591 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:14.253519 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:14.354157 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:14.354122 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:14.355484 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.355468 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:18:14.455027 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:14.454991 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-83.ec2.internal\" not found"
Apr 16 19:18:14.481790 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.481760 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:18:14.559232 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.558953 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-83.ec2.internal"
Apr 16 19:18:14.572182 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.572049 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:18:14.576037 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.573175 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal"
Apr 16 19:18:14.591571 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.591425 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:18:14.933020 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.932938 2580 apiserver.go:52] "Watching apiserver"
Apr 16 19:18:14.941515 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.941489 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 19:18:14.942648 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.942623 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-fvvfr","openshift-ovn-kubernetes/ovnkube-node-q4qdv","kube-system/konnectivity-agent-5zbnv","kube-system/kube-apiserver-proxy-ip-10-0-130-83.ec2.internal","openshift-cluster-node-tuning-operator/tuned-jfm84","openshift-dns/node-resolver-8khcf","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal","openshift-network-diagnostics/network-check-target-zsg8r","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc","openshift-image-registry/node-ca-zmn8r","openshift-multus/multus-additional-cni-plugins-9pgnh","openshift-multus/multus-pjfwp","openshift-multus/network-metrics-daemon-hj7p9"]
Apr 16 19:18:14.944110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.944085 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:14.945468 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.945429 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:14.946537 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.946510 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 19:18:14.946537 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.946526 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 19:18:14.946686 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.946542 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 19:18:14.946686 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.946569 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fdrl6\"" Apr 16 19:18:14.946686 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.946654 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-5zbnv" Apr 16 19:18:14.946846 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.946664 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 19:18:14.947892 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.947811 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 19:18:14.947892 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.947888 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 19:18:14.948051 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.947936 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 19:18:14.948363 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.948121 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 19:18:14.948363 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.948151 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sh9hm\"" Apr 16 19:18:14.949667 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.948962 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 19:18:14.949667 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.949000 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 19:18:14.949667 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.949208 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-fvvfr" Apr 16 19:18:14.949667 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.949335 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 19:18:14.949667 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.949442 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 19:18:14.949667 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.949593 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gcn7q\"" Apr 16 19:18:14.950674 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.950654 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:14.950768 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.950734 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8khcf" Apr 16 19:18:14.951601 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.951583 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dmmlb\"" Apr 16 19:18:14.951931 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.951913 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 19:18:14.952014 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.951929 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 19:18:14.952083 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.952064 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:14.952247 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:14.952219 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:14.952717 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.952698 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:18:14.953053 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.952962 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 19:18:14.953053 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.952982 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 19:18:14.953053 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.953014 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-jrghd\"" Apr 16 19:18:14.953275 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.953057 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dt27d\"" Apr 16 19:18:14.953426 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.953407 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 19:18:14.953538 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.953407 2580 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:18:14.953896 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.953743 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:14.953896 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:14.953848 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:14.955886 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.955857 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" Apr 16 19:18:14.959059 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.958515 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 19:18:14.959059 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.958547 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zmn8r" Apr 16 19:18:14.959059 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.958630 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 19:18:14.959406 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.959386 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 19:18:14.959816 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.959797 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qxzft\"" Apr 16 19:18:14.960694 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.960676 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9pgnh" Apr 16 19:18:14.962235 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.962216 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 19:18:14.963000 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.962955 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fsnql\"" Apr 16 19:18:14.963000 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.962974 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 19:18:14.963149 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.962975 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 19:18:14.963499 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.963478 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 19:18:14.963592 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.963502 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-bc9sd\"" Apr 16 19:18:14.963700 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.963674 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 19:18:14.973123 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973080 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:14.973249 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973139 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f26e651f-cca2-4b49-b9c1-af63f23ad901-agent-certs\") pod \"konnectivity-agent-5zbnv\" (UID: \"f26e651f-cca2-4b49-b9c1-af63f23ad901\") " pod="kube-system/konnectivity-agent-5zbnv" Apr 16 19:18:14.973249 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973226 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-socket-dir\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" Apr 16 19:18:14.973359 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973256 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-kubernetes\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:14.973359 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973280 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-var-lib-openvswitch\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:14.973359 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973316 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-cni-netd\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:14.973359 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973346 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swlwr\" (UniqueName: \"kubernetes.io/projected/0244c6e9-6611-4147-8e78-0345faffa52e-kube-api-access-swlwr\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:14.973542 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973371 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f26e651f-cca2-4b49-b9c1-af63f23ad901-konnectivity-ca\") pod \"konnectivity-agent-5zbnv\" (UID: \"f26e651f-cca2-4b49-b9c1-af63f23ad901\") " 
pod="kube-system/konnectivity-agent-5zbnv" Apr 16 19:18:14.973542 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973397 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-multus-socket-dir-parent\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:14.973542 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973423 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-run-k8s-cni-cncf-io\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:14.973542 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973447 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-sysctl-conf\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:14.973542 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973471 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-run\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:14.973542 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973515 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-host\") pod 
\"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:14.973816 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973565 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-cnibin\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:14.973816 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973591 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-etc-kubernetes\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:14.973816 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973614 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23357efe-6fd5-4b07-ba55-0725ab46c062-tmp\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:14.973816 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973637 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/30909a2b-a27c-4b44-8a1b-c23e90999d15-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh" Apr 16 19:18:14.973816 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973658 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/0244c6e9-6611-4147-8e78-0345faffa52e-ovnkube-config\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:14.973816 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973687 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-var-lib-cni-bin\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:14.973816 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973709 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/030c4af3-2776-4d57-94f4-7fb7b885e5e4-multus-daemon-config\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:14.973816 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973752 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-sysctl-d\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:14.973816 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973785 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/575af952-88cc-43f1-b1c1-1f28a0549971-iptables-alerter-script\") pod \"iptables-alerter-fvvfr\" (UID: \"575af952-88cc-43f1-b1c1-1f28a0549971\") " pod="openshift-network-operator/iptables-alerter-fvvfr" Apr 16 19:18:14.973816 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973812 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-registration-dir\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973835 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-sys\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973856 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-var-lib-kubelet\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973875 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gqhc\" (UniqueName: \"kubernetes.io/projected/23357efe-6fd5-4b07-ba55-0725ab46c062-kube-api-access-7gqhc\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973906 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fad29fdb-1d79-448d-b40e-0652c1dcf698-hosts-file\") pod \"node-resolver-8khcf\" (UID: \"fad29fdb-1d79-448d-b40e-0652c1dcf698\") " 
pod="openshift-dns/node-resolver-8khcf" Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.973957 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxx7r\" (UniqueName: \"kubernetes.io/projected/30909a2b-a27c-4b44-8a1b-c23e90999d15-kube-api-access-bxx7r\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh" Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974015 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-slash\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974044 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0244c6e9-6611-4147-8e78-0345faffa52e-ovnkube-script-lib\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974069 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-etc-selinux\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974092 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-sysconfig\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974122 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-run-systemd\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974146 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pf6t\" (UniqueName: \"kubernetes.io/projected/575af952-88cc-43f1-b1c1-1f28a0549971-kube-api-access-9pf6t\") pod \"iptables-alerter-fvvfr\" (UID: \"575af952-88cc-43f1-b1c1-1f28a0549971\") " pod="openshift-network-operator/iptables-alerter-fvvfr" Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974169 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-device-dir\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974211 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-run-netns\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974234 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-multus-conf-dir\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974257 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-modprobe-d\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:14.974315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974279 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-lib-modules\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974315 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/30909a2b-a27c-4b44-8a1b-c23e90999d15-cni-binary-copy\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974361 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8a90f765-2c10-429d-99f6-bbcf7122c7a0-serviceca\") pod \"node-ca-zmn8r\" (UID: \"8a90f765-2c10-429d-99f6-bbcf7122c7a0\") " pod="openshift-image-registry/node-ca-zmn8r"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974389 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-etc-openvswitch\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974415 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-sys-fs\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974440 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fad29fdb-1d79-448d-b40e-0652c1dcf698-tmp-dir\") pod \"node-resolver-8khcf\" (UID: \"fad29fdb-1d79-448d-b40e-0652c1dcf698\") " pod="openshift-dns/node-resolver-8khcf"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974464 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/30909a2b-a27c-4b44-8a1b-c23e90999d15-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974484 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-run-ovn-kubernetes\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974504 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs5qt\" (UniqueName: \"kubernetes.io/projected/1127c8b7-d0b3-4fc3-8097-3548deff71b5-kube-api-access-cs5qt\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974522 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-hostroot\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974544 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/30909a2b-a27c-4b44-8a1b-c23e90999d15-cnibin\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974587 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/30909a2b-a27c-4b44-8a1b-c23e90999d15-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974633 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0244c6e9-6611-4147-8e78-0345faffa52e-ovn-node-metrics-cert\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974668 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs\") pod \"network-metrics-daemon-hj7p9\" (UID: \"c79c97f4-34fe-4b2b-9f22-401688c77d79\") " pod="openshift-multus/network-metrics-daemon-hj7p9"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974694 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pv9b\" (UniqueName: \"kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b\") pod \"network-check-target-zsg8r\" (UID: \"37161dea-24de-48c4-8bf3-490c4f209803\") " pod="openshift-network-diagnostics/network-check-target-zsg8r"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974718 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a90f765-2c10-429d-99f6-bbcf7122c7a0-host\") pod \"node-ca-zmn8r\" (UID: \"8a90f765-2c10-429d-99f6-bbcf7122c7a0\") " pod="openshift-image-registry/node-ca-zmn8r"
Apr 16 19:18:14.974987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974740 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-run-multus-certs\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974771 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nwt2\" (UniqueName: \"kubernetes.io/projected/fad29fdb-1d79-448d-b40e-0652c1dcf698-kube-api-access-8nwt2\") pod \"node-resolver-8khcf\" (UID: \"fad29fdb-1d79-448d-b40e-0652c1dcf698\") " pod="openshift-dns/node-resolver-8khcf"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974795 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-run-netns\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974819 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-node-log\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974843 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x58ts\" (UniqueName: \"kubernetes.io/projected/c79c97f4-34fe-4b2b-9f22-401688c77d79-kube-api-access-x58ts\") pod \"network-metrics-daemon-hj7p9\" (UID: \"c79c97f4-34fe-4b2b-9f22-401688c77d79\") " pod="openshift-multus/network-metrics-daemon-hj7p9"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974883 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-system-cni-dir\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974948 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-multus-cni-dir\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.974972 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/30909a2b-a27c-4b44-8a1b-c23e90999d15-system-cni-dir\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975003 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-run-ovn\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975029 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975053 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/030c4af3-2776-4d57-94f4-7fb7b885e5e4-cni-binary-copy\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975078 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-var-lib-kubelet\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975105 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sffq4\" (UniqueName: \"kubernetes.io/projected/030c4af3-2776-4d57-94f4-7fb7b885e5e4-kube-api-access-sffq4\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975134 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-tuned\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975157 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-log-socket\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975209 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0244c6e9-6611-4147-8e78-0345faffa52e-env-overrides\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975234 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-os-release\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:14.975637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975258 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-systemd\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:14.976306 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975282 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/30909a2b-a27c-4b44-8a1b-c23e90999d15-os-release\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh"
Apr 16 19:18:14.976306 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975304 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-cni-bin\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:14.976306 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975329 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/575af952-88cc-43f1-b1c1-1f28a0549971-host-slash\") pod \"iptables-alerter-fvvfr\" (UID: \"575af952-88cc-43f1-b1c1-1f28a0549971\") " pod="openshift-network-operator/iptables-alerter-fvvfr"
Apr 16 19:18:14.976306 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975354 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-var-lib-cni-multus\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:14.976306 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975379 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vwcc\" (UniqueName: \"kubernetes.io/projected/8a90f765-2c10-429d-99f6-bbcf7122c7a0-kube-api-access-4vwcc\") pod \"node-ca-zmn8r\" (UID: \"8a90f765-2c10-429d-99f6-bbcf7122c7a0\") " pod="openshift-image-registry/node-ca-zmn8r"
Apr 16 19:18:14.976306 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975404 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-kubelet\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:14.976306 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975434 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-systemd-units\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:14.976306 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:14.975457 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-run-openvswitch\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:15.002469 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.002440 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:13:13 +0000 UTC" deadline="2027-11-15 00:41:31.135936424 +0000 UTC"
Apr 16 19:18:15.002469 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.002468 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13853h23m16.133470809s"
Apr 16 19:18:15.059682 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.059653 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 19:18:15.076672 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.076643 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0244c6e9-6611-4147-8e78-0345faffa52e-ovnkube-config\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:15.076672 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.076684 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-var-lib-cni-bin\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:15.076943 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.076707 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/030c4af3-2776-4d57-94f4-7fb7b885e5e4-multus-daemon-config\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:15.076943 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.076728 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-sysctl-d\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:15.076943 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.076773 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/575af952-88cc-43f1-b1c1-1f28a0549971-iptables-alerter-script\") pod \"iptables-alerter-fvvfr\" (UID: \"575af952-88cc-43f1-b1c1-1f28a0549971\") " pod="openshift-network-operator/iptables-alerter-fvvfr"
Apr 16 19:18:15.076943 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.076773 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-var-lib-cni-bin\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:15.076943 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.076831 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-registration-dir\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc"
Apr 16 19:18:15.076943 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.076856 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-sys\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:15.076943 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.076915 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-var-lib-kubelet\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:15.076943 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.076940 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gqhc\" (UniqueName: \"kubernetes.io/projected/23357efe-6fd5-4b07-ba55-0725ab46c062-kube-api-access-7gqhc\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:15.076943 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.076941 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-sys\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:15.077384 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.076951 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-sysctl-d\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:15.077384 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.076964 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fad29fdb-1d79-448d-b40e-0652c1dcf698-hosts-file\") pod \"node-resolver-8khcf\" (UID: \"fad29fdb-1d79-448d-b40e-0652c1dcf698\") " pod="openshift-dns/node-resolver-8khcf"
Apr 16 19:18:15.077384 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.076951 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-registration-dir\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc"
Apr 16 19:18:15.077384 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077003 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-var-lib-kubelet\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:15.077384 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077050 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxx7r\" (UniqueName: \"kubernetes.io/projected/30909a2b-a27c-4b44-8a1b-c23e90999d15-kube-api-access-bxx7r\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh"
Apr 16 19:18:15.077384 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077085 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-slash\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:15.077384 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077088 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fad29fdb-1d79-448d-b40e-0652c1dcf698-hosts-file\") pod \"node-resolver-8khcf\" (UID: \"fad29fdb-1d79-448d-b40e-0652c1dcf698\") " pod="openshift-dns/node-resolver-8khcf"
Apr 16 19:18:15.077384 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077132 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0244c6e9-6611-4147-8e78-0345faffa52e-ovnkube-script-lib\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:15.077384 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077165 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-etc-selinux\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc"
Apr 16 19:18:15.077384 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077209 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-sysconfig\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:15.077384 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077258 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-run-systemd\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:15.077384 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077311 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pf6t\" (UniqueName: \"kubernetes.io/projected/575af952-88cc-43f1-b1c1-1f28a0549971-kube-api-access-9pf6t\") pod \"iptables-alerter-fvvfr\" (UID: \"575af952-88cc-43f1-b1c1-1f28a0549971\") " pod="openshift-network-operator/iptables-alerter-fvvfr"
Apr 16 19:18:15.077384 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077340 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-device-dir\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc"
Apr 16 19:18:15.077384 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077372 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-run-netns\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077402 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-multus-conf-dir\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077427 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-modprobe-d\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077450 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-lib-modules\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077454 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0244c6e9-6611-4147-8e78-0345faffa52e-ovnkube-config\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077464 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/575af952-88cc-43f1-b1c1-1f28a0549971-iptables-alerter-script\") pod \"iptables-alerter-fvvfr\" (UID: \"575af952-88cc-43f1-b1c1-1f28a0549971\") " pod="openshift-network-operator/iptables-alerter-fvvfr"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077468 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-sysconfig\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077502 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/30909a2b-a27c-4b44-8a1b-c23e90999d15-cni-binary-copy\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077510 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-slash\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077529 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8a90f765-2c10-429d-99f6-bbcf7122c7a0-serviceca\") pod \"node-ca-zmn8r\" (UID: \"8a90f765-2c10-429d-99f6-bbcf7122c7a0\") " pod="openshift-image-registry/node-ca-zmn8r"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077561 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-etc-openvswitch\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077570 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-run-netns\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077584 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-etc-selinux\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077660 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/030c4af3-2776-4d57-94f4-7fb7b885e5e4-multus-daemon-config\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077651 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-device-dir\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077725 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-sys-fs\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077760 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fad29fdb-1d79-448d-b40e-0652c1dcf698-tmp-dir\") pod \"node-resolver-8khcf\" (UID: \"fad29fdb-1d79-448d-b40e-0652c1dcf698\") " pod="openshift-dns/node-resolver-8khcf"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077796 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-lib-modules\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:15.077995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077813 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-run-systemd\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077853 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-multus-conf-dir\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077861 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/30909a2b-a27c-4b44-8a1b-c23e90999d15-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh"
Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077924 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-run-ovn-kubernetes\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077944 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-modprobe-d\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84"
Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077951 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cs5qt\" (UniqueName: \"kubernetes.io/projected/1127c8b7-d0b3-4fc3-8097-3548deff71b5-kube-api-access-cs5qt\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc"
Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.077999 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-etc-openvswitch\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078111 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-hostroot\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp"
Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078139 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/30909a2b-a27c-4b44-8a1b-c23e90999d15-cnibin\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh"
Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078173 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8a90f765-2c10-429d-99f6-bbcf7122c7a0-serviceca\") pod \"node-ca-zmn8r\" (UID: \"8a90f765-2c10-429d-99f6-bbcf7122c7a0\") " pod="openshift-image-registry/node-ca-zmn8r"
Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078163 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/30909a2b-a27c-4b44-8a1b-c23e90999d15-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh"
Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078201 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0244c6e9-6611-4147-8e78-0345faffa52e-ovnkube-script-lib\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078241 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-sys-fs\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc"
Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078288 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-run-ovn-kubernetes\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv"
Apr 
16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078306 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0244c6e9-6611-4147-8e78-0345faffa52e-ovn-node-metrics-cert\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078326 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-hostroot\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078366 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs\") pod \"network-metrics-daemon-hj7p9\" (UID: \"c79c97f4-34fe-4b2b-9f22-401688c77d79\") " pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:15.078804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078373 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/30909a2b-a27c-4b44-8a1b-c23e90999d15-cnibin\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078514 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pv9b\" (UniqueName: \"kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b\") pod \"network-check-target-zsg8r\" (UID: \"37161dea-24de-48c4-8bf3-490c4f209803\") " 
pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078547 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a90f765-2c10-429d-99f6-bbcf7122c7a0-host\") pod \"node-ca-zmn8r\" (UID: \"8a90f765-2c10-429d-99f6-bbcf7122c7a0\") " pod="openshift-image-registry/node-ca-zmn8r" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078571 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-run-multus-certs\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078597 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nwt2\" (UniqueName: \"kubernetes.io/projected/fad29fdb-1d79-448d-b40e-0652c1dcf698-kube-api-access-8nwt2\") pod \"node-resolver-8khcf\" (UID: \"fad29fdb-1d79-448d-b40e-0652c1dcf698\") " pod="openshift-dns/node-resolver-8khcf" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078639 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/30909a2b-a27c-4b44-8a1b-c23e90999d15-cni-binary-copy\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078651 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-run-netns\") pod \"ovnkube-node-q4qdv\" (UID: 
\"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078716 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-run-multus-certs\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078696 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-node-log\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078755 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x58ts\" (UniqueName: \"kubernetes.io/projected/c79c97f4-34fe-4b2b-9f22-401688c77d79-kube-api-access-x58ts\") pod \"network-metrics-daemon-hj7p9\" (UID: \"c79c97f4-34fe-4b2b-9f22-401688c77d79\") " pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078780 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-system-cni-dir\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078785 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a90f765-2c10-429d-99f6-bbcf7122c7a0-host\") pod \"node-ca-zmn8r\" (UID: \"8a90f765-2c10-429d-99f6-bbcf7122c7a0\") 
" pod="openshift-image-registry/node-ca-zmn8r" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078805 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-multus-cni-dir\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078809 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/30909a2b-a27c-4b44-8a1b-c23e90999d15-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078828 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-run-netns\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:15.078574 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078866 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/30909a2b-a27c-4b44-8a1b-c23e90999d15-system-cni-dir\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078895 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-run-ovn\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.079562 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078903 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:15.078942 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs podName:c79c97f4-34fe-4b2b-9f22-401688c77d79 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:15.57890769 +0000 UTC m=+3.066105515 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs") pod "network-metrics-daemon-hj7p9" (UID: "c79c97f4-34fe-4b2b-9f22-401688c77d79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078944 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-run-ovn\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078923 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fad29fdb-1d79-448d-b40e-0652c1dcf698-tmp-dir\") pod \"node-resolver-8khcf\" (UID: \"fad29fdb-1d79-448d-b40e-0652c1dcf698\") " 
pod="openshift-dns/node-resolver-8khcf" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.078981 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079004 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-node-log\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079082 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/030c4af3-2776-4d57-94f4-7fb7b885e5e4-cni-binary-copy\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079129 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/30909a2b-a27c-4b44-8a1b-c23e90999d15-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079159 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-multus-cni-dir\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " 
pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079140 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-var-lib-kubelet\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079216 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/30909a2b-a27c-4b44-8a1b-c23e90999d15-system-cni-dir\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079219 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-var-lib-kubelet\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079239 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sffq4\" (UniqueName: \"kubernetes.io/projected/030c4af3-2776-4d57-94f4-7fb7b885e5e4-kube-api-access-sffq4\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079271 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079276 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-tuned\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079321 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-system-cni-dir\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079329 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-log-socket\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.080379 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079365 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-log-socket\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079367 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0244c6e9-6611-4147-8e78-0345faffa52e-env-overrides\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079505 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-os-release\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079531 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-systemd\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079558 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/30909a2b-a27c-4b44-8a1b-c23e90999d15-os-release\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079604 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-cni-bin\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079630 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/575af952-88cc-43f1-b1c1-1f28a0549971-host-slash\") pod \"iptables-alerter-fvvfr\" (UID: \"575af952-88cc-43f1-b1c1-1f28a0549971\") " 
pod="openshift-network-operator/iptables-alerter-fvvfr" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079653 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-var-lib-cni-multus\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079680 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vwcc\" (UniqueName: \"kubernetes.io/projected/8a90f765-2c10-429d-99f6-bbcf7122c7a0-kube-api-access-4vwcc\") pod \"node-ca-zmn8r\" (UID: \"8a90f765-2c10-429d-99f6-bbcf7122c7a0\") " pod="openshift-image-registry/node-ca-zmn8r" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079734 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-kubelet\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079769 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/30909a2b-a27c-4b44-8a1b-c23e90999d15-os-release\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079791 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-systemd-units\") pod \"ovnkube-node-q4qdv\" (UID: 
\"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079802 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0244c6e9-6611-4147-8e78-0345faffa52e-env-overrides\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079838 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-run-openvswitch\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079844 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/030c4af3-2776-4d57-94f4-7fb7b885e5e4-cni-binary-copy\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079830 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-systemd\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079864 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-cni-bin\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079905 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-kubelet\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081110 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079911 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-var-lib-cni-multus\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079878 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079958 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-run-openvswitch\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079922 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-systemd-units\") pod \"ovnkube-node-q4qdv\" (UID: 
\"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079934 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-os-release\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079960 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f26e651f-cca2-4b49-b9c1-af63f23ad901-agent-certs\") pod \"konnectivity-agent-5zbnv\" (UID: \"f26e651f-cca2-4b49-b9c1-af63f23ad901\") " pod="kube-system/konnectivity-agent-5zbnv" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080025 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-socket-dir\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079966 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080090 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-kubernetes\") pod 
\"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080156 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-var-lib-openvswitch\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080214 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-cni-netd\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080261 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-kubernetes\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080285 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-var-lib-openvswitch\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080313 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1127c8b7-d0b3-4fc3-8097-3548deff71b5-socket-dir\") pod 
\"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.079801 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/575af952-88cc-43f1-b1c1-1f28a0549971-host-slash\") pod \"iptables-alerter-fvvfr\" (UID: \"575af952-88cc-43f1-b1c1-1f28a0549971\") " pod="openshift-network-operator/iptables-alerter-fvvfr" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080361 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swlwr\" (UniqueName: \"kubernetes.io/projected/0244c6e9-6611-4147-8e78-0345faffa52e-kube-api-access-swlwr\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080404 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f26e651f-cca2-4b49-b9c1-af63f23ad901-konnectivity-ca\") pod \"konnectivity-agent-5zbnv\" (UID: \"f26e651f-cca2-4b49-b9c1-af63f23ad901\") " pod="kube-system/konnectivity-agent-5zbnv" Apr 16 19:18:15.081872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080436 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-multus-socket-dir-parent\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080431 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/0244c6e9-6611-4147-8e78-0345faffa52e-host-cni-netd\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080466 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-run-k8s-cni-cncf-io\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080519 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-sysctl-conf\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080528 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-multus-socket-dir-parent\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080567 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-run\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080562 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-host-run-k8s-cni-cncf-io\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080597 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-host\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080657 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-sysctl-conf\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080647 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-cnibin\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080706 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-etc-kubernetes\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080707 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-host\") pod \"tuned-jfm84\" (UID: 
\"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080711 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-cnibin\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080635 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23357efe-6fd5-4b07-ba55-0725ab46c062-run\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080731 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23357efe-6fd5-4b07-ba55-0725ab46c062-tmp\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080769 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/30909a2b-a27c-4b44-8a1b-c23e90999d15-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.080778 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/030c4af3-2776-4d57-94f4-7fb7b885e5e4-etc-kubernetes\") pod \"multus-pjfwp\" (UID: 
\"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.081918 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/30909a2b-a27c-4b44-8a1b-c23e90999d15-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh" Apr 16 19:18:15.082617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.082253 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f26e651f-cca2-4b49-b9c1-af63f23ad901-konnectivity-ca\") pod \"konnectivity-agent-5zbnv\" (UID: \"f26e651f-cca2-4b49-b9c1-af63f23ad901\") " pod="kube-system/konnectivity-agent-5zbnv" Apr 16 19:18:15.086536 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.083874 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0244c6e9-6611-4147-8e78-0345faffa52e-ovn-node-metrics-cert\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.086536 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.083937 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f26e651f-cca2-4b49-b9c1-af63f23ad901-agent-certs\") pod \"konnectivity-agent-5zbnv\" (UID: \"f26e651f-cca2-4b49-b9c1-af63f23ad901\") " pod="kube-system/konnectivity-agent-5zbnv" Apr 16 19:18:15.086536 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.084514 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/23357efe-6fd5-4b07-ba55-0725ab46c062-etc-tuned\") pod \"tuned-jfm84\" (UID: 
\"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.086536 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.084669 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23357efe-6fd5-4b07-ba55-0725ab46c062-tmp\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.089660 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.089631 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pf6t\" (UniqueName: \"kubernetes.io/projected/575af952-88cc-43f1-b1c1-1f28a0549971-kube-api-access-9pf6t\") pod \"iptables-alerter-fvvfr\" (UID: \"575af952-88cc-43f1-b1c1-1f28a0549971\") " pod="openshift-network-operator/iptables-alerter-fvvfr" Apr 16 19:18:15.089894 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:15.089869 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:18:15.089962 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:15.089902 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:18:15.089962 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:15.089918 2580 projected.go:194] Error preparing data for projected volume kube-api-access-7pv9b for pod openshift-network-diagnostics/network-check-target-zsg8r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:15.090085 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:15.089979 2580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b podName:37161dea-24de-48c4-8bf3-490c4f209803 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:15.589960821 +0000 UTC m=+3.077158644 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7pv9b" (UniqueName: "kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b") pod "network-check-target-zsg8r" (UID: "37161dea-24de-48c4-8bf3-490c4f209803") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:15.090846 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.090825 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxx7r\" (UniqueName: \"kubernetes.io/projected/30909a2b-a27c-4b44-8a1b-c23e90999d15-kube-api-access-bxx7r\") pod \"multus-additional-cni-plugins-9pgnh\" (UID: \"30909a2b-a27c-4b44-8a1b-c23e90999d15\") " pod="openshift-multus/multus-additional-cni-plugins-9pgnh" Apr 16 19:18:15.090976 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.090952 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gqhc\" (UniqueName: \"kubernetes.io/projected/23357efe-6fd5-4b07-ba55-0725ab46c062-kube-api-access-7gqhc\") pod \"tuned-jfm84\" (UID: \"23357efe-6fd5-4b07-ba55-0725ab46c062\") " pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.091558 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.091505 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vwcc\" (UniqueName: \"kubernetes.io/projected/8a90f765-2c10-429d-99f6-bbcf7122c7a0-kube-api-access-4vwcc\") pod \"node-ca-zmn8r\" (UID: \"8a90f765-2c10-429d-99f6-bbcf7122c7a0\") " pod="openshift-image-registry/node-ca-zmn8r" Apr 16 19:18:15.092087 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.092066 
2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs5qt\" (UniqueName: \"kubernetes.io/projected/1127c8b7-d0b3-4fc3-8097-3548deff71b5-kube-api-access-cs5qt\") pod \"aws-ebs-csi-driver-node-d62pc\" (UID: \"1127c8b7-d0b3-4fc3-8097-3548deff71b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" Apr 16 19:18:15.092495 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.092474 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x58ts\" (UniqueName: \"kubernetes.io/projected/c79c97f4-34fe-4b2b-9f22-401688c77d79-kube-api-access-x58ts\") pod \"network-metrics-daemon-hj7p9\" (UID: \"c79c97f4-34fe-4b2b-9f22-401688c77d79\") " pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:15.092583 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.092567 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sffq4\" (UniqueName: \"kubernetes.io/projected/030c4af3-2776-4d57-94f4-7fb7b885e5e4-kube-api-access-sffq4\") pod \"multus-pjfwp\" (UID: \"030c4af3-2776-4d57-94f4-7fb7b885e5e4\") " pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.092678 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.092661 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nwt2\" (UniqueName: \"kubernetes.io/projected/fad29fdb-1d79-448d-b40e-0652c1dcf698-kube-api-access-8nwt2\") pod \"node-resolver-8khcf\" (UID: \"fad29fdb-1d79-448d-b40e-0652c1dcf698\") " pod="openshift-dns/node-resolver-8khcf" Apr 16 19:18:15.092920 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.092901 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swlwr\" (UniqueName: \"kubernetes.io/projected/0244c6e9-6611-4147-8e78-0345faffa52e-kube-api-access-swlwr\") pod \"ovnkube-node-q4qdv\" (UID: \"0244c6e9-6611-4147-8e78-0345faffa52e\") " pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.258269 
ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.258227 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pjfwp" Apr 16 19:18:15.265301 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.265276 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:15.273792 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.273771 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5zbnv" Apr 16 19:18:15.281351 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.281325 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fvvfr" Apr 16 19:18:15.289068 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.289044 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jfm84" Apr 16 19:18:15.296742 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.296718 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8khcf" Apr 16 19:18:15.303400 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.303378 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" Apr 16 19:18:15.309994 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.309970 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zmn8r" Apr 16 19:18:15.315596 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.315578 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9pgnh" Apr 16 19:18:15.466176 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.466143 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:18:15.584753 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.584682 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs\") pod \"network-metrics-daemon-hj7p9\" (UID: \"c79c97f4-34fe-4b2b-9f22-401688c77d79\") " pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:15.584879 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:15.584800 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:15.584879 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:15.584859 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs podName:c79c97f4-34fe-4b2b-9f22-401688c77d79 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:16.584841867 +0000 UTC m=+4.072039679 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs") pod "network-metrics-daemon-hj7p9" (UID: "c79c97f4-34fe-4b2b-9f22-401688c77d79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:15.624333 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:15.624202 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30909a2b_a27c_4b44_8a1b_c23e90999d15.slice/crio-5a3b4dd84a050c2a83cedc7d188be050cbfa887c5b1db413229ae9ab3c4a5d97 WatchSource:0}: Error finding container 5a3b4dd84a050c2a83cedc7d188be050cbfa887c5b1db413229ae9ab3c4a5d97: Status 404 returned error can't find the container with id 5a3b4dd84a050c2a83cedc7d188be050cbfa887c5b1db413229ae9ab3c4a5d97 Apr 16 19:18:15.625857 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:15.625832 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf26e651f_cca2_4b49_b9c1_af63f23ad901.slice/crio-dfda92b46960faf7f5da4297fcae89365d59ea34103e5d1f46c7607695d91e2f WatchSource:0}: Error finding container dfda92b46960faf7f5da4297fcae89365d59ea34103e5d1f46c7607695d91e2f: Status 404 returned error can't find the container with id dfda92b46960faf7f5da4297fcae89365d59ea34103e5d1f46c7607695d91e2f Apr 16 19:18:15.629515 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:15.629492 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0244c6e9_6611_4147_8e78_0345faffa52e.slice/crio-d395a7d83515688aa1a2d59cc5c15e30be935c1aa72d38bddcf0209169fc7b7e WatchSource:0}: Error finding container d395a7d83515688aa1a2d59cc5c15e30be935c1aa72d38bddcf0209169fc7b7e: Status 404 returned error can't find the container with id d395a7d83515688aa1a2d59cc5c15e30be935c1aa72d38bddcf0209169fc7b7e Apr 16 19:18:15.630809 
ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:15.630783 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad29fdb_1d79_448d_b40e_0652c1dcf698.slice/crio-a9ecb9f349c86ef91ad2f556ec603ad1203f843d87095ea1ef1798545662b682 WatchSource:0}: Error finding container a9ecb9f349c86ef91ad2f556ec603ad1203f843d87095ea1ef1798545662b682: Status 404 returned error can't find the container with id a9ecb9f349c86ef91ad2f556ec603ad1203f843d87095ea1ef1798545662b682 Apr 16 19:18:15.631777 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:15.631656 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23357efe_6fd5_4b07_ba55_0725ab46c062.slice/crio-40356b230663f2ef606adf703158e7168d86831da89a8d4d73cca8fee8618ae8 WatchSource:0}: Error finding container 40356b230663f2ef606adf703158e7168d86831da89a8d4d73cca8fee8618ae8: Status 404 returned error can't find the container with id 40356b230663f2ef606adf703158e7168d86831da89a8d4d73cca8fee8618ae8 Apr 16 19:18:15.632968 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:15.632944 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod030c4af3_2776_4d57_94f4_7fb7b885e5e4.slice/crio-0d01392d254a7816fa9576883e43e16f46d86eddf1c58e24489642720227a19a WatchSource:0}: Error finding container 0d01392d254a7816fa9576883e43e16f46d86eddf1c58e24489642720227a19a: Status 404 returned error can't find the container with id 0d01392d254a7816fa9576883e43e16f46d86eddf1c58e24489642720227a19a Apr 16 19:18:15.633689 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:15.633658 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod575af952_88cc_43f1_b1c1_1f28a0549971.slice/crio-934abfca57f9215872cd85051c5026d9fabb140b6dde10ff7e9a66b98796cd29 WatchSource:0}: Error 
finding container 934abfca57f9215872cd85051c5026d9fabb140b6dde10ff7e9a66b98796cd29: Status 404 returned error can't find the container with id 934abfca57f9215872cd85051c5026d9fabb140b6dde10ff7e9a66b98796cd29 Apr 16 19:18:15.634610 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:15.634585 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1127c8b7_d0b3_4fc3_8097_3548deff71b5.slice/crio-860b601ae075f77248f7583410b4170669634561f8b4d2b43f0ff671f96c9864 WatchSource:0}: Error finding container 860b601ae075f77248f7583410b4170669634561f8b4d2b43f0ff671f96c9864: Status 404 returned error can't find the container with id 860b601ae075f77248f7583410b4170669634561f8b4d2b43f0ff671f96c9864 Apr 16 19:18:15.685032 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:15.685006 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pv9b\" (UniqueName: \"kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b\") pod \"network-check-target-zsg8r\" (UID: \"37161dea-24de-48c4-8bf3-490c4f209803\") " pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:15.685160 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:15.685126 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:18:15.685160 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:15.685146 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:18:15.685160 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:15.685159 2580 projected.go:194] Error preparing data for projected volume kube-api-access-7pv9b for pod openshift-network-diagnostics/network-check-target-zsg8r: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:15.685345 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:15.685234 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b podName:37161dea-24de-48c4-8bf3-490c4f209803 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:16.685213646 +0000 UTC m=+4.172411470 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7pv9b" (UniqueName: "kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b") pod "network-check-target-zsg8r" (UID: "37161dea-24de-48c4-8bf3-490c4f209803") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:16.003608 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:16.003567 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:13:13 +0000 UTC" deadline="2027-12-06 09:19:31.713055463 +0000 UTC" Apr 16 19:18:16.003608 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:16.003605 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14366h1m15.70945371s" Apr 16 19:18:16.095231 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:16.095158 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-83.ec2.internal" event={"ID":"058aac4aaec6cd6ac195597fdcd3d1b7","Type":"ContainerStarted","Data":"8d131303e381bdfa5916c23dc18993eaac9ac04bbe78b8e0fa2b10c4968ba2aa"} Apr 16 19:18:16.098964 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:16.098930 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" event={"ID":"1127c8b7-d0b3-4fc3-8097-3548deff71b5","Type":"ContainerStarted","Data":"860b601ae075f77248f7583410b4170669634561f8b4d2b43f0ff671f96c9864"} Apr 16 19:18:16.101925 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:16.101895 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fvvfr" event={"ID":"575af952-88cc-43f1-b1c1-1f28a0549971","Type":"ContainerStarted","Data":"934abfca57f9215872cd85051c5026d9fabb140b6dde10ff7e9a66b98796cd29"} Apr 16 19:18:16.103957 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:16.103899 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jfm84" event={"ID":"23357efe-6fd5-4b07-ba55-0725ab46c062","Type":"ContainerStarted","Data":"40356b230663f2ef606adf703158e7168d86831da89a8d4d73cca8fee8618ae8"} Apr 16 19:18:16.115284 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:16.115254 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" event={"ID":"0244c6e9-6611-4147-8e78-0345faffa52e","Type":"ContainerStarted","Data":"d395a7d83515688aa1a2d59cc5c15e30be935c1aa72d38bddcf0209169fc7b7e"} Apr 16 19:18:16.121930 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:16.121898 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5zbnv" event={"ID":"f26e651f-cca2-4b49-b9c1-af63f23ad901","Type":"ContainerStarted","Data":"dfda92b46960faf7f5da4297fcae89365d59ea34103e5d1f46c7607695d91e2f"} Apr 16 19:18:16.125018 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:16.124941 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pgnh" event={"ID":"30909a2b-a27c-4b44-8a1b-c23e90999d15","Type":"ContainerStarted","Data":"5a3b4dd84a050c2a83cedc7d188be050cbfa887c5b1db413229ae9ab3c4a5d97"} Apr 16 19:18:16.128610 ip-10-0-130-83 kubenswrapper[2580]: I0416 
19:18:16.128569 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zmn8r" event={"ID":"8a90f765-2c10-429d-99f6-bbcf7122c7a0","Type":"ContainerStarted","Data":"3c63e89e764f4872bbdaabcac5a5988edccc4b6097b1999c257b39afc83c86b0"} Apr 16 19:18:16.133925 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:16.133847 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pjfwp" event={"ID":"030c4af3-2776-4d57-94f4-7fb7b885e5e4","Type":"ContainerStarted","Data":"0d01392d254a7816fa9576883e43e16f46d86eddf1c58e24489642720227a19a"} Apr 16 19:18:16.139730 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:16.139669 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8khcf" event={"ID":"fad29fdb-1d79-448d-b40e-0652c1dcf698","Type":"ContainerStarted","Data":"a9ecb9f349c86ef91ad2f556ec603ad1203f843d87095ea1ef1798545662b682"} Apr 16 19:18:16.593338 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:16.593308 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs\") pod \"network-metrics-daemon-hj7p9\" (UID: \"c79c97f4-34fe-4b2b-9f22-401688c77d79\") " pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:16.593834 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:16.593810 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:16.593946 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:16.593900 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs podName:c79c97f4-34fe-4b2b-9f22-401688c77d79 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:18.593869624 +0000 UTC m=+6.081067449 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs") pod "network-metrics-daemon-hj7p9" (UID: "c79c97f4-34fe-4b2b-9f22-401688c77d79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:16.694090 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:16.694039 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pv9b\" (UniqueName: \"kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b\") pod \"network-check-target-zsg8r\" (UID: \"37161dea-24de-48c4-8bf3-490c4f209803\") " pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:16.694276 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:16.694255 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:18:16.694276 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:16.694276 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:18:16.694394 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:16.694290 2580 projected.go:194] Error preparing data for projected volume kube-api-access-7pv9b for pod openshift-network-diagnostics/network-check-target-zsg8r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:16.694394 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:16.694350 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b podName:37161dea-24de-48c4-8bf3-490c4f209803 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:18:18.694331345 +0000 UTC m=+6.181529161 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7pv9b" (UniqueName: "kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b") pod "network-check-target-zsg8r" (UID: "37161dea-24de-48c4-8bf3-490c4f209803") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:17.085278 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:17.085183 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:17.085707 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:17.085342 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:17.086136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:17.085767 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:17.086136 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:17.085873 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:17.154151 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:17.154075 2580 generic.go:358] "Generic (PLEG): container finished" podID="189f219007fb4c6d42f85db8190298b0" containerID="e79acbf532097a5403d8359bc973a86a5645624bdb52dc077bda2b48e03637a0" exitCode=0 Apr 16 19:18:17.155000 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:17.154950 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal" event={"ID":"189f219007fb4c6d42f85db8190298b0","Type":"ContainerDied","Data":"e79acbf532097a5403d8359bc973a86a5645624bdb52dc077bda2b48e03637a0"} Apr 16 19:18:17.178739 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:17.178684 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-83.ec2.internal" podStartSLOduration=3.178665316 podStartE2EDuration="3.178665316s" podCreationTimestamp="2026-04-16 19:18:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:18:16.110636198 +0000 UTC m=+3.597834031" watchObservedRunningTime="2026-04-16 19:18:17.178665316 +0000 UTC m=+4.665863151" Apr 16 19:18:18.162465 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:18.162407 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal" event={"ID":"189f219007fb4c6d42f85db8190298b0","Type":"ContainerStarted","Data":"74199fc1cab0bf0855542d4d2b1662cb6ab3184c0537a4decb05974b501c463e"} Apr 16 19:18:18.614885 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:18.614839 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs\") pod 
\"network-metrics-daemon-hj7p9\" (UID: \"c79c97f4-34fe-4b2b-9f22-401688c77d79\") " pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:18.615110 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:18.615091 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:18.615236 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:18.615201 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs podName:c79c97f4-34fe-4b2b-9f22-401688c77d79 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:22.615168426 +0000 UTC m=+10.102366240 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs") pod "network-metrics-daemon-hj7p9" (UID: "c79c97f4-34fe-4b2b-9f22-401688c77d79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:18.715805 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:18.715754 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pv9b\" (UniqueName: \"kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b\") pod \"network-check-target-zsg8r\" (UID: \"37161dea-24de-48c4-8bf3-490c4f209803\") " pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:18.715973 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:18.715950 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:18:18.715973 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:18.715971 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 
16 19:18:18.716102 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:18.715984 2580 projected.go:194] Error preparing data for projected volume kube-api-access-7pv9b for pod openshift-network-diagnostics/network-check-target-zsg8r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:18.716102 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:18.716043 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b podName:37161dea-24de-48c4-8bf3-490c4f209803 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:22.716023323 +0000 UTC m=+10.203221133 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7pv9b" (UniqueName: "kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b") pod "network-check-target-zsg8r" (UID: "37161dea-24de-48c4-8bf3-490c4f209803") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:19.082673 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:19.082125 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:19.082673 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:19.082289 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:19.082910 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:19.082699 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:19.082910 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:19.082799 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:21.082262 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:21.082220 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:21.082743 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:21.082371 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:21.082805 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:21.082765 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:21.082878 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:21.082859 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:22.650427 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:22.650387 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs\") pod \"network-metrics-daemon-hj7p9\" (UID: \"c79c97f4-34fe-4b2b-9f22-401688c77d79\") " pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:22.650913 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:22.650520 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:22.650913 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:22.650604 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs podName:c79c97f4-34fe-4b2b-9f22-401688c77d79 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:30.650583862 +0000 UTC m=+18.137781694 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs") pod "network-metrics-daemon-hj7p9" (UID: "c79c97f4-34fe-4b2b-9f22-401688c77d79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:22.751304 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:22.751208 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pv9b\" (UniqueName: \"kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b\") pod \"network-check-target-zsg8r\" (UID: \"37161dea-24de-48c4-8bf3-490c4f209803\") " pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:22.751480 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:22.751349 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:18:22.751480 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:22.751374 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:18:22.751480 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:22.751387 2580 projected.go:194] Error preparing data for projected volume kube-api-access-7pv9b for pod openshift-network-diagnostics/network-check-target-zsg8r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:22.751480 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:22.751447 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b podName:37161dea-24de-48c4-8bf3-490c4f209803 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:18:30.751427715 +0000 UTC m=+18.238625527 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7pv9b" (UniqueName: "kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b") pod "network-check-target-zsg8r" (UID: "37161dea-24de-48c4-8bf3-490c4f209803") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:23.083342 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:23.083300 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:23.083534 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:23.083438 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:23.083685 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:23.083665 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:23.083793 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:23.083756 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:25.081800 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:25.081750 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:25.081800 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:25.081794 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:25.082366 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:25.081893 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:25.082366 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:25.082016 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:27.081841 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:27.081802 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:27.081841 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:27.081830 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:27.082360 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:27.081941 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:27.082360 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:27.082056 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:29.082094 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:29.081877 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:29.082566 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:29.081880 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:29.082566 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:29.082226 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:29.082566 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:29.082299 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:30.710550 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:30.710510 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs\") pod \"network-metrics-daemon-hj7p9\" (UID: \"c79c97f4-34fe-4b2b-9f22-401688c77d79\") " pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:30.711077 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:30.710663 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:30.711077 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:30.710742 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs podName:c79c97f4-34fe-4b2b-9f22-401688c77d79 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:46.71071441 +0000 UTC m=+34.197912220 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs") pod "network-metrics-daemon-hj7p9" (UID: "c79c97f4-34fe-4b2b-9f22-401688c77d79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:30.811062 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:30.811028 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pv9b\" (UniqueName: \"kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b\") pod \"network-check-target-zsg8r\" (UID: \"37161dea-24de-48c4-8bf3-490c4f209803\") " pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:30.811237 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:30.811172 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:18:30.811237 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:30.811205 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:18:30.811237 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:30.811217 2580 projected.go:194] Error preparing data for projected volume kube-api-access-7pv9b for pod openshift-network-diagnostics/network-check-target-zsg8r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:30.811381 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:30.811266 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b podName:37161dea-24de-48c4-8bf3-490c4f209803 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:18:46.811253251 +0000 UTC m=+34.298451074 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7pv9b" (UniqueName: "kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b") pod "network-check-target-zsg8r" (UID: "37161dea-24de-48c4-8bf3-490c4f209803") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:31.082754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:31.082677 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:31.082754 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:31.082720 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:31.082976 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:31.082816 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:31.082976 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:31.082947 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:33.083401 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.083176 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:33.084207 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.083291 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:33.084207 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:33.083489 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:33.084207 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:33.083558 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:33.189846 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.189814 2580 generic.go:358] "Generic (PLEG): container finished" podID="30909a2b-a27c-4b44-8a1b-c23e90999d15" containerID="f2e936783e36e23263192742ee747f2255b691c82bede94fdd8f0e02c13616b4" exitCode=0 Apr 16 19:18:33.189960 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.189891 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pgnh" event={"ID":"30909a2b-a27c-4b44-8a1b-c23e90999d15","Type":"ContainerDied","Data":"f2e936783e36e23263192742ee747f2255b691c82bede94fdd8f0e02c13616b4"} Apr 16 19:18:33.194766 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.193168 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zmn8r" event={"ID":"8a90f765-2c10-429d-99f6-bbcf7122c7a0","Type":"ContainerStarted","Data":"9d03dd402d2de37898276f6f273f6f384f1fe47db6071bb358a24fe98e52a951"} Apr 16 19:18:33.197727 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.197702 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pjfwp" event={"ID":"030c4af3-2776-4d57-94f4-7fb7b885e5e4","Type":"ContainerStarted","Data":"43378f0fd8012bff98d3123762163c19bb48a04c6a110fed9c57fdc2c3b1ae76"} Apr 16 19:18:33.198980 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.198957 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8khcf" event={"ID":"fad29fdb-1d79-448d-b40e-0652c1dcf698","Type":"ContainerStarted","Data":"cc74e4cc6cfa6a432203892774b3c5df3d8ed79951b7fd29a0ff4c904ba1a961"} Apr 16 19:18:33.200804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.200784 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" 
event={"ID":"1127c8b7-d0b3-4fc3-8097-3548deff71b5","Type":"ContainerStarted","Data":"be549d058fd93c8f5f71e1f53adc12c2405d6819839839604d8fc6baaad12a29"} Apr 16 19:18:33.201955 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.201920 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jfm84" event={"ID":"23357efe-6fd5-4b07-ba55-0725ab46c062","Type":"ContainerStarted","Data":"3b691bc5b094b570092415dfaf76b1403b9ed7a509b428d200a5e3980c6994e7"} Apr 16 19:18:33.203919 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.203904 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:18:33.204215 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.204174 2580 generic.go:358] "Generic (PLEG): container finished" podID="0244c6e9-6611-4147-8e78-0345faffa52e" containerID="2b1a9dce994ebf57115822c7eeb1832c4cc8cddc7a6fd59b92137c0eaef489e2" exitCode=1 Apr 16 19:18:33.204215 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.204209 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" event={"ID":"0244c6e9-6611-4147-8e78-0345faffa52e","Type":"ContainerStarted","Data":"c611ab11c26311c154c845ee5689f9103793479004988004c75ab61dc0b5de83"} Apr 16 19:18:33.204336 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.204236 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" event={"ID":"0244c6e9-6611-4147-8e78-0345faffa52e","Type":"ContainerStarted","Data":"0125dc0f714e26223c4878df5336cb88b277296ea69bcb9e8784047995b95ff0"} Apr 16 19:18:33.204336 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.204250 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" 
event={"ID":"0244c6e9-6611-4147-8e78-0345faffa52e","Type":"ContainerDied","Data":"2b1a9dce994ebf57115822c7eeb1832c4cc8cddc7a6fd59b92137c0eaef489e2"} Apr 16 19:18:33.204336 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.204265 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" event={"ID":"0244c6e9-6611-4147-8e78-0345faffa52e","Type":"ContainerStarted","Data":"7e806dd8e46ca0a2b42c5c5a7caa827e29a0d9a7f2b08d1428ea88385f08aeef"} Apr 16 19:18:33.205410 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.205391 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5zbnv" event={"ID":"f26e651f-cca2-4b49-b9c1-af63f23ad901","Type":"ContainerStarted","Data":"63c3f1f173693ef393daf338b2e10b56f86c9d2962bad9e0a6d14f5375b46523"} Apr 16 19:18:33.224669 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.224629 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-83.ec2.internal" podStartSLOduration=19.224616198 podStartE2EDuration="19.224616198s" podCreationTimestamp="2026-04-16 19:18:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:18:18.181369647 +0000 UTC m=+5.668567480" watchObservedRunningTime="2026-04-16 19:18:33.224616198 +0000 UTC m=+20.711814030" Apr 16 19:18:33.274368 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.274324 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zmn8r" podStartSLOduration=3.51461738 podStartE2EDuration="20.274310634s" podCreationTimestamp="2026-04-16 19:18:13 +0000 UTC" firstStartedPulling="2026-04-16 19:18:15.640421225 +0000 UTC m=+3.127619046" lastFinishedPulling="2026-04-16 19:18:32.400114489 +0000 UTC m=+19.887312300" observedRunningTime="2026-04-16 19:18:33.24569697 +0000 UTC m=+20.732894803" 
watchObservedRunningTime="2026-04-16 19:18:33.274310634 +0000 UTC m=+20.761508468" Apr 16 19:18:33.274500 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.274456 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pjfwp" podStartSLOduration=3.471204358 podStartE2EDuration="20.274452135s" podCreationTimestamp="2026-04-16 19:18:13 +0000 UTC" firstStartedPulling="2026-04-16 19:18:15.635312985 +0000 UTC m=+3.122510797" lastFinishedPulling="2026-04-16 19:18:32.438560761 +0000 UTC m=+19.925758574" observedRunningTime="2026-04-16 19:18:33.273496755 +0000 UTC m=+20.760694598" watchObservedRunningTime="2026-04-16 19:18:33.274452135 +0000 UTC m=+20.761649967" Apr 16 19:18:33.301903 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.301850 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jfm84" podStartSLOduration=3.534984127 podStartE2EDuration="20.301835317s" podCreationTimestamp="2026-04-16 19:18:13 +0000 UTC" firstStartedPulling="2026-04-16 19:18:15.634564724 +0000 UTC m=+3.121762537" lastFinishedPulling="2026-04-16 19:18:32.401415912 +0000 UTC m=+19.888613727" observedRunningTime="2026-04-16 19:18:33.301410741 +0000 UTC m=+20.788608573" watchObservedRunningTime="2026-04-16 19:18:33.301835317 +0000 UTC m=+20.789033149" Apr 16 19:18:33.319043 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.318997 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8khcf" podStartSLOduration=3.551696916 podStartE2EDuration="20.31898361s" podCreationTimestamp="2026-04-16 19:18:13 +0000 UTC" firstStartedPulling="2026-04-16 19:18:15.632791624 +0000 UTC m=+3.119989439" lastFinishedPulling="2026-04-16 19:18:32.400078317 +0000 UTC m=+19.887276133" observedRunningTime="2026-04-16 19:18:33.31808503 +0000 UTC m=+20.805282862" watchObservedRunningTime="2026-04-16 19:18:33.31898361 +0000 UTC m=+20.806181442" Apr 16 
19:18:33.341174 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.341115 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-5zbnv" podStartSLOduration=3.5694401559999998 podStartE2EDuration="20.341098149s" podCreationTimestamp="2026-04-16 19:18:13 +0000 UTC" firstStartedPulling="2026-04-16 19:18:15.628067255 +0000 UTC m=+3.115265066" lastFinishedPulling="2026-04-16 19:18:32.399725247 +0000 UTC m=+19.886923059" observedRunningTime="2026-04-16 19:18:33.339709627 +0000 UTC m=+20.826907460" watchObservedRunningTime="2026-04-16 19:18:33.341098149 +0000 UTC m=+20.828295984" Apr 16 19:18:33.914632 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:33.914423 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 19:18:34.028910 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:34.028781 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:18:33.914628595Z","UUID":"20df20f6-f931-48f5-a034-5ffe8dfd7762","Handler":null,"Name":"","Endpoint":""} Apr 16 19:18:34.030548 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:34.030519 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 19:18:34.030548 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:34.030554 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 19:18:34.209390 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:34.209350 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" 
event={"ID":"1127c8b7-d0b3-4fc3-8097-3548deff71b5","Type":"ContainerStarted","Data":"cec5208c7f7589e9eec77cc9dde1f477af0d9b2770334b241fbd33673586964a"} Apr 16 19:18:34.210878 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:34.210853 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fvvfr" event={"ID":"575af952-88cc-43f1-b1c1-1f28a0549971","Type":"ContainerStarted","Data":"67da2196ca06b41ba9ff8b895216f410c0a9c7ff4e6e6a24fe71c5a662fa77ec"} Apr 16 19:18:34.213611 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:34.213591 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:18:34.214052 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:34.214027 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" event={"ID":"0244c6e9-6611-4147-8e78-0345faffa52e","Type":"ContainerStarted","Data":"b7fd6306a53f883cad78d36a75e7fbcda51c14eada11729c7e68f98ca9abebf4"} Apr 16 19:18:34.214138 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:34.214075 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" event={"ID":"0244c6e9-6611-4147-8e78-0345faffa52e","Type":"ContainerStarted","Data":"01e2652a87528a0a5149038a7403c737c5fcec7dd76b970c0216d2e5d18756f4"} Apr 16 19:18:34.227706 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:34.227645 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fvvfr" podStartSLOduration=4.466005767 podStartE2EDuration="21.22762709s" podCreationTimestamp="2026-04-16 19:18:13 +0000 UTC" firstStartedPulling="2026-04-16 19:18:15.638462814 +0000 UTC m=+3.125660628" lastFinishedPulling="2026-04-16 19:18:32.400084133 +0000 UTC m=+19.887281951" observedRunningTime="2026-04-16 19:18:34.227164179 +0000 UTC m=+21.714362011" 
watchObservedRunningTime="2026-04-16 19:18:34.22762709 +0000 UTC m=+21.714824923" Apr 16 19:18:35.082131 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:35.082070 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:35.082279 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:35.082082 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:35.082279 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:35.082229 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:35.082399 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:35.082288 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:35.217703 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:35.217617 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" event={"ID":"1127c8b7-d0b3-4fc3-8097-3548deff71b5","Type":"ContainerStarted","Data":"7244f4cbdd45893703ce30e557f3148f2adfc85406e0d66238c1a6c32da5a432"} Apr 16 19:18:35.237847 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:35.237794 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d62pc" podStartSLOduration=2.915222919 podStartE2EDuration="22.237780006s" podCreationTimestamp="2026-04-16 19:18:13 +0000 UTC" firstStartedPulling="2026-04-16 19:18:15.639095113 +0000 UTC m=+3.126292925" lastFinishedPulling="2026-04-16 19:18:34.961652202 +0000 UTC m=+22.448850012" observedRunningTime="2026-04-16 19:18:35.237335597 +0000 UTC m=+22.724533429" watchObservedRunningTime="2026-04-16 19:18:35.237780006 +0000 UTC m=+22.724977837" Apr 16 19:18:36.224590 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:36.224563 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:18:36.225090 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:36.224930 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" event={"ID":"0244c6e9-6611-4147-8e78-0345faffa52e","Type":"ContainerStarted","Data":"f663639e9f40932b4454365d55dd290158fee6691c46e062e41de63bb1f2c49a"} Apr 16 19:18:36.889415 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:36.889379 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-5zbnv" Apr 16 19:18:36.890158 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:36.890131 2580 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-5zbnv" Apr 16 19:18:37.082330 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:37.082296 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:37.082502 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:37.082297 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:37.082502 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:37.082423 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:37.082502 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:37.082495 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:38.119339 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:38.119151 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-5zbnv" Apr 16 19:18:38.119880 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:38.119758 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-5zbnv" Apr 16 19:18:38.231306 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:38.231287 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:18:38.231649 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:38.231627 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" event={"ID":"0244c6e9-6611-4147-8e78-0345faffa52e","Type":"ContainerStarted","Data":"efc9e086b64ec8fd90ebf7a2955456aba09b7165489a8b4996d712b5fb66f6be"} Apr 16 19:18:38.231936 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:38.231916 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:38.232047 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:38.231942 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:38.232102 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:38.232079 2580 scope.go:117] "RemoveContainer" containerID="2b1a9dce994ebf57115822c7eeb1832c4cc8cddc7a6fd59b92137c0eaef489e2" Apr 16 19:18:38.233412 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:38.233389 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pgnh" 
event={"ID":"30909a2b-a27c-4b44-8a1b-c23e90999d15","Type":"ContainerStarted","Data":"73ae542f4bf6484c65b42e1f68c0dac24e5a8a0a0e8f178f2076b0351fde6a2c"} Apr 16 19:18:38.255086 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:38.255066 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:39.081966 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:39.081929 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:39.082162 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:39.082077 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:39.082162 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:39.082138 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:39.082294 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:39.082265 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:39.238613 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:39.238588 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:18:39.239006 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:39.238939 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" event={"ID":"0244c6e9-6611-4147-8e78-0345faffa52e","Type":"ContainerStarted","Data":"2a287bbcc0eb792b92249e14ceac088b0566da068b4193eadb88a91aa05bd2ef"} Apr 16 19:18:39.239335 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:39.239315 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:39.240705 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:39.240681 2580 generic.go:358] "Generic (PLEG): container finished" podID="30909a2b-a27c-4b44-8a1b-c23e90999d15" containerID="73ae542f4bf6484c65b42e1f68c0dac24e5a8a0a0e8f178f2076b0351fde6a2c" exitCode=0 Apr 16 19:18:39.240797 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:39.240710 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pgnh" event={"ID":"30909a2b-a27c-4b44-8a1b-c23e90999d15","Type":"ContainerDied","Data":"73ae542f4bf6484c65b42e1f68c0dac24e5a8a0a0e8f178f2076b0351fde6a2c"} Apr 16 19:18:39.253281 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:39.253257 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:18:39.272293 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:39.272248 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" podStartSLOduration=9.193842099 podStartE2EDuration="26.272235256s" 
podCreationTimestamp="2026-04-16 19:18:13 +0000 UTC" firstStartedPulling="2026-04-16 19:18:15.631383406 +0000 UTC m=+3.118581215" lastFinishedPulling="2026-04-16 19:18:32.709776558 +0000 UTC m=+20.196974372" observedRunningTime="2026-04-16 19:18:39.271947792 +0000 UTC m=+26.759145623" watchObservedRunningTime="2026-04-16 19:18:39.272235256 +0000 UTC m=+26.759433087" Apr 16 19:18:39.839293 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:39.839085 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zsg8r"] Apr 16 19:18:39.839450 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:39.839383 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:39.839518 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:39.839495 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:39.842155 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:39.842128 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hj7p9"] Apr 16 19:18:39.842284 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:39.842233 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:39.842337 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:39.842319 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:41.082406 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:41.082371 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:41.082888 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:41.082516 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:41.246814 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:41.246779 2580 generic.go:358] "Generic (PLEG): container finished" podID="30909a2b-a27c-4b44-8a1b-c23e90999d15" containerID="b1d44fa4d5cb1acc98eabf1a96a856371cc0987e2fd1343843a97d33b4a87e78" exitCode=0 Apr 16 19:18:41.246956 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:41.246830 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pgnh" event={"ID":"30909a2b-a27c-4b44-8a1b-c23e90999d15","Type":"ContainerDied","Data":"b1d44fa4d5cb1acc98eabf1a96a856371cc0987e2fd1343843a97d33b4a87e78"} Apr 16 19:18:42.081720 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:42.081686 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:42.081907 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:42.081793 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:43.083101 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:43.083065 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:43.083556 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:43.083178 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:43.251968 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:43.251932 2580 generic.go:358] "Generic (PLEG): container finished" podID="30909a2b-a27c-4b44-8a1b-c23e90999d15" containerID="404d4ded5a4d6d621a2d9eecdd478ec336418aadbdd00b676db80bf7b2e01063" exitCode=0 Apr 16 19:18:43.252132 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:43.251996 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pgnh" event={"ID":"30909a2b-a27c-4b44-8a1b-c23e90999d15","Type":"ContainerDied","Data":"404d4ded5a4d6d621a2d9eecdd478ec336418aadbdd00b676db80bf7b2e01063"} Apr 16 19:18:44.082360 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:44.082328 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:44.082529 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:44.082443 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsg8r" podUID="37161dea-24de-48c4-8bf3-490c4f209803" Apr 16 19:18:45.082158 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.081923 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:45.082599 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:45.082292 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hj7p9" podUID="c79c97f4-34fe-4b2b-9f22-401688c77d79" Apr 16 19:18:45.214964 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.214940 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8khcf_fad29fdb-1d79-448d-b40e-0652c1dcf698/dns-node-resolver/0.log" Apr 16 19:18:45.364630 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.364531 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-83.ec2.internal" event="NodeReady" Apr 16 19:18:45.364791 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.364655 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 19:18:45.414974 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.414942 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2pfdm"] Apr 16 19:18:45.435993 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.435964 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6pw99"] Apr 16 19:18:45.436153 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.436137 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:45.438659 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.438634 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 19:18:45.438841 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.438634 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-n84tw\"" Apr 16 19:18:45.438841 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.438638 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 19:18:45.461086 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.461059 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2pfdm"] Apr 16 19:18:45.461086 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.461090 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6pw99"] Apr 16 19:18:45.461300 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.461213 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6pw99" Apr 16 19:18:45.464505 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.464477 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 19:18:45.464635 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.464559 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 19:18:45.464680 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.464488 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 19:18:45.467404 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.464924 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9bx8r\"" Apr 16 19:18:45.517174 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.517141 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kxwh\" (UniqueName: \"kubernetes.io/projected/f5d78334-d61f-4f3e-878c-9726541364d0-kube-api-access-9kxwh\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:45.517348 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.517179 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f5d78334-d61f-4f3e-878c-9726541364d0-tmp-dir\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:45.517348 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.517235 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/f5d78334-d61f-4f3e-878c-9726541364d0-config-volume\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:45.517348 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.517304 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:45.618239 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.618141 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kxwh\" (UniqueName: \"kubernetes.io/projected/f5d78334-d61f-4f3e-878c-9726541364d0-kube-api-access-9kxwh\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:45.618239 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.618201 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f5d78334-d61f-4f3e-878c-9726541364d0-tmp-dir\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:45.618239 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.618233 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert\") pod \"ingress-canary-6pw99\" (UID: \"512ebef3-162f-4664-8e29-302b9cbd3861\") " pod="openshift-ingress-canary/ingress-canary-6pw99" Apr 16 19:18:45.618507 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.618268 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/f5d78334-d61f-4f3e-878c-9726541364d0-config-volume\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:45.618507 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.618293 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6z52\" (UniqueName: \"kubernetes.io/projected/512ebef3-162f-4664-8e29-302b9cbd3861-kube-api-access-m6z52\") pod \"ingress-canary-6pw99\" (UID: \"512ebef3-162f-4664-8e29-302b9cbd3861\") " pod="openshift-ingress-canary/ingress-canary-6pw99" Apr 16 19:18:45.618507 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.618326 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:45.618507 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:45.618481 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:18:45.618737 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:45.618554 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls podName:f5d78334-d61f-4f3e-878c-9726541364d0 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:46.118533965 +0000 UTC m=+33.605731796 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls") pod "dns-default-2pfdm" (UID: "f5d78334-d61f-4f3e-878c-9726541364d0") : secret "dns-default-metrics-tls" not found Apr 16 19:18:45.618737 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.618618 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f5d78334-d61f-4f3e-878c-9726541364d0-tmp-dir\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:45.619253 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.619230 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5d78334-d61f-4f3e-878c-9726541364d0-config-volume\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:45.630247 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.630223 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kxwh\" (UniqueName: \"kubernetes.io/projected/f5d78334-d61f-4f3e-878c-9726541364d0-kube-api-access-9kxwh\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:45.719396 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.719357 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert\") pod \"ingress-canary-6pw99\" (UID: \"512ebef3-162f-4664-8e29-302b9cbd3861\") " pod="openshift-ingress-canary/ingress-canary-6pw99" Apr 16 19:18:45.719396 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.719404 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6z52\" (UniqueName: 
\"kubernetes.io/projected/512ebef3-162f-4664-8e29-302b9cbd3861-kube-api-access-m6z52\") pod \"ingress-canary-6pw99\" (UID: \"512ebef3-162f-4664-8e29-302b9cbd3861\") " pod="openshift-ingress-canary/ingress-canary-6pw99" Apr 16 19:18:45.719637 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:45.719528 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:18:45.719637 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:45.719600 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert podName:512ebef3-162f-4664-8e29-302b9cbd3861 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:46.21958417 +0000 UTC m=+33.706781983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert") pod "ingress-canary-6pw99" (UID: "512ebef3-162f-4664-8e29-302b9cbd3861") : secret "canary-serving-cert" not found Apr 16 19:18:45.736951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:45.736927 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6z52\" (UniqueName: \"kubernetes.io/projected/512ebef3-162f-4664-8e29-302b9cbd3861-kube-api-access-m6z52\") pod \"ingress-canary-6pw99\" (UID: \"512ebef3-162f-4664-8e29-302b9cbd3861\") " pod="openshift-ingress-canary/ingress-canary-6pw99" Apr 16 19:18:46.082575 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.082543 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:46.085286 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.085265 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 19:18:46.085402 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.085367 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-s2294\"" Apr 16 19:18:46.085460 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.085431 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:18:46.122387 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.122359 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:46.122558 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:46.122526 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:18:46.122639 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:46.122619 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls podName:f5d78334-d61f-4f3e-878c-9726541364d0 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:47.122595709 +0000 UTC m=+34.609793540 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls") pod "dns-default-2pfdm" (UID: "f5d78334-d61f-4f3e-878c-9726541364d0") : secret "dns-default-metrics-tls" not found Apr 16 19:18:46.223554 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.223519 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert\") pod \"ingress-canary-6pw99\" (UID: \"512ebef3-162f-4664-8e29-302b9cbd3861\") " pod="openshift-ingress-canary/ingress-canary-6pw99" Apr 16 19:18:46.223722 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:46.223675 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:18:46.223799 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:46.223738 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert podName:512ebef3-162f-4664-8e29-302b9cbd3861 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:47.223718604 +0000 UTC m=+34.710916437 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert") pod "ingress-canary-6pw99" (UID: "512ebef3-162f-4664-8e29-302b9cbd3861") : secret "canary-serving-cert" not found Apr 16 19:18:46.487100 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.487058 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zws62"] Apr 16 19:18:46.515735 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.515693 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zws62"] Apr 16 19:18:46.515907 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.515822 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-zws62" Apr 16 19:18:46.518368 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.518340 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 19:18:46.518491 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.518400 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-s6lmd\"" Apr 16 19:18:46.518559 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.518538 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 19:18:46.518639 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.518622 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 19:18:46.518690 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.518627 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 19:18:46.596248 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.596222 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zmn8r_8a90f765-2c10-429d-99f6-bbcf7122c7a0/node-ca/0.log" Apr 16 19:18:46.626534 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.626504 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7b20734d-841f-4fd6-a7d8-84516a97df06-signing-key\") pod \"service-ca-865cb79987-zws62\" (UID: \"7b20734d-841f-4fd6-a7d8-84516a97df06\") " pod="openshift-service-ca/service-ca-865cb79987-zws62" Apr 16 19:18:46.626658 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.626541 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ttnkm\" (UniqueName: \"kubernetes.io/projected/7b20734d-841f-4fd6-a7d8-84516a97df06-kube-api-access-ttnkm\") pod \"service-ca-865cb79987-zws62\" (UID: \"7b20734d-841f-4fd6-a7d8-84516a97df06\") " pod="openshift-service-ca/service-ca-865cb79987-zws62" Apr 16 19:18:46.626658 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.626629 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7b20734d-841f-4fd6-a7d8-84516a97df06-signing-cabundle\") pod \"service-ca-865cb79987-zws62\" (UID: \"7b20734d-841f-4fd6-a7d8-84516a97df06\") " pod="openshift-service-ca/service-ca-865cb79987-zws62" Apr 16 19:18:46.727231 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.727183 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7b20734d-841f-4fd6-a7d8-84516a97df06-signing-key\") pod \"service-ca-865cb79987-zws62\" (UID: \"7b20734d-841f-4fd6-a7d8-84516a97df06\") " pod="openshift-service-ca/service-ca-865cb79987-zws62" Apr 16 19:18:46.727394 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.727240 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttnkm\" (UniqueName: \"kubernetes.io/projected/7b20734d-841f-4fd6-a7d8-84516a97df06-kube-api-access-ttnkm\") pod \"service-ca-865cb79987-zws62\" (UID: \"7b20734d-841f-4fd6-a7d8-84516a97df06\") " pod="openshift-service-ca/service-ca-865cb79987-zws62" Apr 16 19:18:46.727394 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.727269 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs\") pod \"network-metrics-daemon-hj7p9\" (UID: \"c79c97f4-34fe-4b2b-9f22-401688c77d79\") " pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:46.727394 ip-10-0-130-83 
kubenswrapper[2580]: I0416 19:18:46.727306 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7b20734d-841f-4fd6-a7d8-84516a97df06-signing-cabundle\") pod \"service-ca-865cb79987-zws62\" (UID: \"7b20734d-841f-4fd6-a7d8-84516a97df06\") " pod="openshift-service-ca/service-ca-865cb79987-zws62" Apr 16 19:18:46.727564 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:46.727425 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:46.727564 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:46.727524 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs podName:c79c97f4-34fe-4b2b-9f22-401688c77d79 nodeName:}" failed. No retries permitted until 2026-04-16 19:19:18.727502229 +0000 UTC m=+66.214700055 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs") pod "network-metrics-daemon-hj7p9" (UID: "c79c97f4-34fe-4b2b-9f22-401688c77d79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:46.728051 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.728033 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7b20734d-841f-4fd6-a7d8-84516a97df06-signing-cabundle\") pod \"service-ca-865cb79987-zws62\" (UID: \"7b20734d-841f-4fd6-a7d8-84516a97df06\") " pod="openshift-service-ca/service-ca-865cb79987-zws62" Apr 16 19:18:46.729858 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.729841 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7b20734d-841f-4fd6-a7d8-84516a97df06-signing-key\") pod 
\"service-ca-865cb79987-zws62\" (UID: \"7b20734d-841f-4fd6-a7d8-84516a97df06\") " pod="openshift-service-ca/service-ca-865cb79987-zws62" Apr 16 19:18:46.735967 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.735947 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttnkm\" (UniqueName: \"kubernetes.io/projected/7b20734d-841f-4fd6-a7d8-84516a97df06-kube-api-access-ttnkm\") pod \"service-ca-865cb79987-zws62\" (UID: \"7b20734d-841f-4fd6-a7d8-84516a97df06\") " pod="openshift-service-ca/service-ca-865cb79987-zws62" Apr 16 19:18:46.826418 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.826340 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-zws62" Apr 16 19:18:46.828292 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.828268 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pv9b\" (UniqueName: \"kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b\") pod \"network-check-target-zsg8r\" (UID: \"37161dea-24de-48c4-8bf3-490c4f209803\") " pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:46.831433 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.831406 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pv9b\" (UniqueName: \"kubernetes.io/projected/37161dea-24de-48c4-8bf3-490c4f209803-kube-api-access-7pv9b\") pod \"network-check-target-zsg8r\" (UID: \"37161dea-24de-48c4-8bf3-490c4f209803\") " pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:46.991860 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.991829 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:46.996128 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:46.996098 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-zws62"] Apr 16 19:18:47.008506 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:47.008466 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b20734d_841f_4fd6_a7d8_84516a97df06.slice/crio-e3960173c94bfc8c5e2435c00af9f00d5f50182037a456958a3169ccf3498e35 WatchSource:0}: Error finding container e3960173c94bfc8c5e2435c00af9f00d5f50182037a456958a3169ccf3498e35: Status 404 returned error can't find the container with id e3960173c94bfc8c5e2435c00af9f00d5f50182037a456958a3169ccf3498e35 Apr 16 19:18:47.082837 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:47.082763 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:18:47.085756 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:47.085730 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:18:47.085927 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:47.085910 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t5lg8\"" Apr 16 19:18:47.131889 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:47.131856 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:47.132049 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:47.132006 2580 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:18:47.132109 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:47.132069 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls podName:f5d78334-d61f-4f3e-878c-9726541364d0 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:49.132053093 +0000 UTC m=+36.619250907 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls") pod "dns-default-2pfdm" (UID: "f5d78334-d61f-4f3e-878c-9726541364d0") : secret "dns-default-metrics-tls" not found Apr 16 19:18:47.142490 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:47.142459 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zsg8r"] Apr 16 19:18:47.145867 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:18:47.145825 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37161dea_24de_48c4_8bf3_490c4f209803.slice/crio-cd046a2ca3becd5e137aedcad1e7d2934be993e5839945286b4b790ed3a5a19f WatchSource:0}: Error finding container cd046a2ca3becd5e137aedcad1e7d2934be993e5839945286b4b790ed3a5a19f: Status 404 returned error can't find the container with id cd046a2ca3becd5e137aedcad1e7d2934be993e5839945286b4b790ed3a5a19f Apr 16 19:18:47.232568 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:47.232529 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert\") pod \"ingress-canary-6pw99\" (UID: \"512ebef3-162f-4664-8e29-302b9cbd3861\") " pod="openshift-ingress-canary/ingress-canary-6pw99" Apr 16 19:18:47.232715 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:47.232655 2580 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:18:47.232715 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:47.232705 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert podName:512ebef3-162f-4664-8e29-302b9cbd3861 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:49.23269237 +0000 UTC m=+36.719890180 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert") pod "ingress-canary-6pw99" (UID: "512ebef3-162f-4664-8e29-302b9cbd3861") : secret "canary-serving-cert" not found Apr 16 19:18:47.262267 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:47.262234 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zsg8r" event={"ID":"37161dea-24de-48c4-8bf3-490c4f209803","Type":"ContainerStarted","Data":"cd046a2ca3becd5e137aedcad1e7d2934be993e5839945286b4b790ed3a5a19f"} Apr 16 19:18:47.263276 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:47.263243 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-zws62" event={"ID":"7b20734d-841f-4fd6-a7d8-84516a97df06","Type":"ContainerStarted","Data":"e3960173c94bfc8c5e2435c00af9f00d5f50182037a456958a3169ccf3498e35"} Apr 16 19:18:49.150349 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:49.150310 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:49.150764 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:49.150474 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 
19:18:49.150764 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:49.150547 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls podName:f5d78334-d61f-4f3e-878c-9726541364d0 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:53.150530162 +0000 UTC m=+40.637727994 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls") pod "dns-default-2pfdm" (UID: "f5d78334-d61f-4f3e-878c-9726541364d0") : secret "dns-default-metrics-tls" not found Apr 16 19:18:49.250901 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:49.250861 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert\") pod \"ingress-canary-6pw99\" (UID: \"512ebef3-162f-4664-8e29-302b9cbd3861\") " pod="openshift-ingress-canary/ingress-canary-6pw99" Apr 16 19:18:49.251073 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:49.251022 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:18:49.251132 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:49.251089 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert podName:512ebef3-162f-4664-8e29-302b9cbd3861 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:53.251070317 +0000 UTC m=+40.738268154 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert") pod "ingress-canary-6pw99" (UID: "512ebef3-162f-4664-8e29-302b9cbd3861") : secret "canary-serving-cert" not found Apr 16 19:18:50.272942 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:50.272711 2580 generic.go:358] "Generic (PLEG): container finished" podID="30909a2b-a27c-4b44-8a1b-c23e90999d15" containerID="d52836fb2425f64d487bdf33cba699a8f2ba4127c238e9191b6f9378b12bf5fa" exitCode=0 Apr 16 19:18:50.272942 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:50.272793 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pgnh" event={"ID":"30909a2b-a27c-4b44-8a1b-c23e90999d15","Type":"ContainerDied","Data":"d52836fb2425f64d487bdf33cba699a8f2ba4127c238e9191b6f9378b12bf5fa"} Apr 16 19:18:51.276962 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:51.276932 2580 generic.go:358] "Generic (PLEG): container finished" podID="30909a2b-a27c-4b44-8a1b-c23e90999d15" containerID="bc972c89a1322b64a54c95508dab2a19a4ef1db7d50eff242a490f66f347518b" exitCode=0 Apr 16 19:18:51.277357 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:51.276973 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pgnh" event={"ID":"30909a2b-a27c-4b44-8a1b-c23e90999d15","Type":"ContainerDied","Data":"bc972c89a1322b64a54c95508dab2a19a4ef1db7d50eff242a490f66f347518b"} Apr 16 19:18:52.286332 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:52.286292 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pgnh" event={"ID":"30909a2b-a27c-4b44-8a1b-c23e90999d15","Type":"ContainerStarted","Data":"75abdaa62ea533843be22cdf79cdd952ea50248319b60489686765187b5e6fc6"} Apr 16 19:18:52.287920 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:52.287892 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-865cb79987-zws62" event={"ID":"7b20734d-841f-4fd6-a7d8-84516a97df06","Type":"ContainerStarted","Data":"2b1a5e1ef0be509650fed79de2cceca1f729cf66cca41442106db0a713ca05fe"} Apr 16 19:18:52.311402 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:52.311343 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9pgnh" podStartSLOduration=5.455601963 podStartE2EDuration="39.311328325s" podCreationTimestamp="2026-04-16 19:18:13 +0000 UTC" firstStartedPulling="2026-04-16 19:18:15.627220634 +0000 UTC m=+3.114418449" lastFinishedPulling="2026-04-16 19:18:49.482947001 +0000 UTC m=+36.970144811" observedRunningTime="2026-04-16 19:18:52.31049333 +0000 UTC m=+39.797691164" watchObservedRunningTime="2026-04-16 19:18:52.311328325 +0000 UTC m=+39.798526158" Apr 16 19:18:52.326276 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:52.326229 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-zws62" podStartSLOduration=2.112141585 podStartE2EDuration="6.326211999s" podCreationTimestamp="2026-04-16 19:18:46 +0000 UTC" firstStartedPulling="2026-04-16 19:18:47.010854208 +0000 UTC m=+34.498052023" lastFinishedPulling="2026-04-16 19:18:51.22492462 +0000 UTC m=+38.712122437" observedRunningTime="2026-04-16 19:18:52.325496271 +0000 UTC m=+39.812694105" watchObservedRunningTime="2026-04-16 19:18:52.326211999 +0000 UTC m=+39.813409828" Apr 16 19:18:53.180473 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:53.180424 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:18:53.180676 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:53.180568 2580 secret.go:189] Couldn't get 
secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:18:53.180676 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:53.180626 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls podName:f5d78334-d61f-4f3e-878c-9726541364d0 nodeName:}" failed. No retries permitted until 2026-04-16 19:19:01.18060738 +0000 UTC m=+48.667805198 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls") pod "dns-default-2pfdm" (UID: "f5d78334-d61f-4f3e-878c-9726541364d0") : secret "dns-default-metrics-tls" not found Apr 16 19:18:53.281127 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:53.281044 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert\") pod \"ingress-canary-6pw99\" (UID: \"512ebef3-162f-4664-8e29-302b9cbd3861\") " pod="openshift-ingress-canary/ingress-canary-6pw99" Apr 16 19:18:53.281285 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:53.281176 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:18:53.281285 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:18:53.281254 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert podName:512ebef3-162f-4664-8e29-302b9cbd3861 nodeName:}" failed. No retries permitted until 2026-04-16 19:19:01.28123621 +0000 UTC m=+48.768434037 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert") pod "ingress-canary-6pw99" (UID: "512ebef3-162f-4664-8e29-302b9cbd3861") : secret "canary-serving-cert" not found Apr 16 19:18:53.292169 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:53.292125 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zsg8r" event={"ID":"37161dea-24de-48c4-8bf3-490c4f209803","Type":"ContainerStarted","Data":"e3700c578bc2e10dc995038df40ef720137aba1eff3b047186c26dd1f82d3842"} Apr 16 19:18:53.292592 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:53.292379 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:18:53.310306 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:18:53.310238 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zsg8r" podStartSLOduration=34.485868976 podStartE2EDuration="40.310216362s" podCreationTimestamp="2026-04-16 19:18:13 +0000 UTC" firstStartedPulling="2026-04-16 19:18:47.148139405 +0000 UTC m=+34.635337216" lastFinishedPulling="2026-04-16 19:18:52.972486785 +0000 UTC m=+40.459684602" observedRunningTime="2026-04-16 19:18:53.309362918 +0000 UTC m=+40.796560750" watchObservedRunningTime="2026-04-16 19:18:53.310216362 +0000 UTC m=+40.797414186" Apr 16 19:19:01.236805 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:01.236758 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:19:01.240273 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:01.240246 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5d78334-d61f-4f3e-878c-9726541364d0-metrics-tls\") pod \"dns-default-2pfdm\" (UID: \"f5d78334-d61f-4f3e-878c-9726541364d0\") " pod="openshift-dns/dns-default-2pfdm" Apr 16 19:19:01.338304 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:01.338263 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert\") pod \"ingress-canary-6pw99\" (UID: \"512ebef3-162f-4664-8e29-302b9cbd3861\") " pod="openshift-ingress-canary/ingress-canary-6pw99" Apr 16 19:19:01.340773 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:01.340745 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/512ebef3-162f-4664-8e29-302b9cbd3861-cert\") pod \"ingress-canary-6pw99\" (UID: \"512ebef3-162f-4664-8e29-302b9cbd3861\") " pod="openshift-ingress-canary/ingress-canary-6pw99" Apr 16 19:19:01.348674 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:01.348645 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2pfdm" Apr 16 19:19:01.372383 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:01.372355 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6pw99" Apr 16 19:19:01.489264 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:01.489106 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2pfdm"] Apr 16 19:19:01.493604 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:19:01.493573 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5d78334_d61f_4f3e_878c_9726541364d0.slice/crio-d77e780b52cc21bcf06a09b66f5b0a3ab1965ac60eef75648979350177e98aae WatchSource:0}: Error finding container d77e780b52cc21bcf06a09b66f5b0a3ab1965ac60eef75648979350177e98aae: Status 404 returned error can't find the container with id d77e780b52cc21bcf06a09b66f5b0a3ab1965ac60eef75648979350177e98aae Apr 16 19:19:01.510804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:01.510776 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6pw99"] Apr 16 19:19:01.514992 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:19:01.514959 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod512ebef3_162f_4664_8e29_302b9cbd3861.slice/crio-7fa15219a0ffb6aee80e8e2eaf12e280f4493054b38ea63457d087181caca2a7 WatchSource:0}: Error finding container 7fa15219a0ffb6aee80e8e2eaf12e280f4493054b38ea63457d087181caca2a7: Status 404 returned error can't find the container with id 7fa15219a0ffb6aee80e8e2eaf12e280f4493054b38ea63457d087181caca2a7 Apr 16 19:19:02.312892 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:02.312852 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2pfdm" event={"ID":"f5d78334-d61f-4f3e-878c-9726541364d0","Type":"ContainerStarted","Data":"d77e780b52cc21bcf06a09b66f5b0a3ab1965ac60eef75648979350177e98aae"} Apr 16 19:19:02.314163 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:02.314134 2580 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6pw99" event={"ID":"512ebef3-162f-4664-8e29-302b9cbd3861","Type":"ContainerStarted","Data":"7fa15219a0ffb6aee80e8e2eaf12e280f4493054b38ea63457d087181caca2a7"} Apr 16 19:19:04.321419 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:04.321379 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2pfdm" event={"ID":"f5d78334-d61f-4f3e-878c-9726541364d0","Type":"ContainerStarted","Data":"1a3d60a1807aafdce3906cfb14a85bff9e2c7d9b3280cf9c536a930099684d94"} Apr 16 19:19:04.321419 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:04.321420 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2pfdm" event={"ID":"f5d78334-d61f-4f3e-878c-9726541364d0","Type":"ContainerStarted","Data":"f3009c2b3b8ad11d836d650ab2acfd8e19f90b2032fad05d1601746857801570"} Apr 16 19:19:04.321976 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:04.321531 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2pfdm" Apr 16 19:19:04.322674 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:04.322655 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6pw99" event={"ID":"512ebef3-162f-4664-8e29-302b9cbd3861","Type":"ContainerStarted","Data":"7e2c86eb6846ee89aed943e6d1159cc58653dc4191e3dd61d3b58b4e7b89e16e"} Apr 16 19:19:04.339488 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:04.339444 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2pfdm" podStartSLOduration=16.985350709 podStartE2EDuration="19.33943188s" podCreationTimestamp="2026-04-16 19:18:45 +0000 UTC" firstStartedPulling="2026-04-16 19:19:01.495528543 +0000 UTC m=+48.982726358" lastFinishedPulling="2026-04-16 19:19:03.849609704 +0000 UTC m=+51.336807529" observedRunningTime="2026-04-16 19:19:04.338535674 +0000 UTC m=+51.825733506" watchObservedRunningTime="2026-04-16 
19:19:04.33943188 +0000 UTC m=+51.826629747" Apr 16 19:19:04.353950 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:04.353903 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6pw99" podStartSLOduration=17.016353818 podStartE2EDuration="19.353890443s" podCreationTimestamp="2026-04-16 19:18:45 +0000 UTC" firstStartedPulling="2026-04-16 19:19:01.517054446 +0000 UTC m=+49.004252258" lastFinishedPulling="2026-04-16 19:19:03.854591072 +0000 UTC m=+51.341788883" observedRunningTime="2026-04-16 19:19:04.353158238 +0000 UTC m=+51.840356071" watchObservedRunningTime="2026-04-16 19:19:04.353890443 +0000 UTC m=+51.841088274" Apr 16 19:19:11.262799 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:11.262771 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q4qdv" Apr 16 19:19:13.131796 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.131759 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6sxn2"] Apr 16 19:19:13.167679 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.167648 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6sxn2"] Apr 16 19:19:13.167830 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.167787 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.171575 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.171545 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 19:19:13.171575 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.171546 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vl99s\"" Apr 16 19:19:13.171782 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.171604 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 19:19:13.172015 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.171994 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 19:19:13.172277 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.172138 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 19:19:13.209966 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.209924 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj4p2\" (UniqueName: \"kubernetes.io/projected/c706952f-9a60-44db-b459-bd44650c58c3-kube-api-access-sj4p2\") pod \"insights-runtime-extractor-6sxn2\" (UID: \"c706952f-9a60-44db-b459-bd44650c58c3\") " pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.209966 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.209968 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c706952f-9a60-44db-b459-bd44650c58c3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6sxn2\" (UID: 
\"c706952f-9a60-44db-b459-bd44650c58c3\") " pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.210216 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.209996 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c706952f-9a60-44db-b459-bd44650c58c3-crio-socket\") pod \"insights-runtime-extractor-6sxn2\" (UID: \"c706952f-9a60-44db-b459-bd44650c58c3\") " pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.210216 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.210011 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c706952f-9a60-44db-b459-bd44650c58c3-data-volume\") pod \"insights-runtime-extractor-6sxn2\" (UID: \"c706952f-9a60-44db-b459-bd44650c58c3\") " pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.210216 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.210029 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c706952f-9a60-44db-b459-bd44650c58c3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6sxn2\" (UID: \"c706952f-9a60-44db-b459-bd44650c58c3\") " pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.222490 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.222458 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f54jm"] Apr 16 19:19:13.253551 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.253524 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs"] Apr 16 19:19:13.253725 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.253694 2580 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f54jm" Apr 16 19:19:13.255961 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.255908 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 19:19:13.256125 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.256109 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-ddddh\"" Apr 16 19:19:13.272398 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.272374 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75dcbb4775-msk2s"] Apr 16 19:19:13.272536 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.272521 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.275539 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.275518 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 19:19:13.275660 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.275587 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 19:19:13.275660 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.275595 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 19:19:13.275919 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.275900 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 19:19:13.275919 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.275912 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 19:19:13.276051 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.275936 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 19:19:13.276051 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.275911 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 19:19:13.291340 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.291315 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f54jm"] Apr 16 19:19:13.291340 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.291342 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75dcbb4775-msk2s"] Apr 16 19:19:13.291340 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.291352 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs"] Apr 16 19:19:13.291536 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.291445 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75dcbb4775-msk2s" Apr 16 19:19:13.293570 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.293548 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-dzphs\"" Apr 16 19:19:13.293798 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.293782 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 19:19:13.310671 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.310643 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c706952f-9a60-44db-b459-bd44650c58c3-data-volume\") pod \"insights-runtime-extractor-6sxn2\" (UID: \"c706952f-9a60-44db-b459-bd44650c58c3\") " pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.310827 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.310677 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3177a1ae-d609-460a-b4b1-f6a41125fd25-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-75dcbb4775-msk2s\" (UID: \"3177a1ae-d609-460a-b4b1-f6a41125fd25\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75dcbb4775-msk2s" Apr 16 19:19:13.310827 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.310698 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c706952f-9a60-44db-b459-bd44650c58c3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6sxn2\" (UID: \"c706952f-9a60-44db-b459-bd44650c58c3\") " pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.310827 
ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.310715 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3dcb10b2-5efe-4644-9b58-27c46c722d34-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.310827 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.310788 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3dcb10b2-5efe-4644-9b58-27c46c722d34-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.310987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.310841 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3dcb10b2-5efe-4644-9b58-27c46c722d34-hub\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.310987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.310870 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh6fx\" (UniqueName: \"kubernetes.io/projected/3177a1ae-d609-460a-b4b1-f6a41125fd25-kube-api-access-qh6fx\") pod \"managed-serviceaccount-addon-agent-75dcbb4775-msk2s\" (UID: \"3177a1ae-d609-460a-b4b1-f6a41125fd25\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75dcbb4775-msk2s" Apr 16 19:19:13.310987 ip-10-0-130-83 kubenswrapper[2580]: I0416 
19:19:13.310913 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c9e5c2e8-1a16-46a7-8cdf-f98fc27ce9a6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-f54jm\" (UID: \"c9e5c2e8-1a16-46a7-8cdf-f98fc27ce9a6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f54jm" Apr 16 19:19:13.310987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.310937 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3dcb10b2-5efe-4644-9b58-27c46c722d34-ca\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.310987 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.310960 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3dcb10b2-5efe-4644-9b58-27c46c722d34-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.311250 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.311005 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-878pc\" (UniqueName: \"kubernetes.io/projected/3dcb10b2-5efe-4644-9b58-27c46c722d34-kube-api-access-878pc\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.311250 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.311025 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c706952f-9a60-44db-b459-bd44650c58c3-data-volume\") pod \"insights-runtime-extractor-6sxn2\" (UID: \"c706952f-9a60-44db-b459-bd44650c58c3\") " pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.311250 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.311060 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sj4p2\" (UniqueName: \"kubernetes.io/projected/c706952f-9a60-44db-b459-bd44650c58c3-kube-api-access-sj4p2\") pod \"insights-runtime-extractor-6sxn2\" (UID: \"c706952f-9a60-44db-b459-bd44650c58c3\") " pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.311250 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.311099 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c706952f-9a60-44db-b459-bd44650c58c3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6sxn2\" (UID: \"c706952f-9a60-44db-b459-bd44650c58c3\") " pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.311250 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.311148 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c706952f-9a60-44db-b459-bd44650c58c3-crio-socket\") pod \"insights-runtime-extractor-6sxn2\" (UID: \"c706952f-9a60-44db-b459-bd44650c58c3\") " pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.311250 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.311242 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c706952f-9a60-44db-b459-bd44650c58c3-crio-socket\") pod \"insights-runtime-extractor-6sxn2\" (UID: \"c706952f-9a60-44db-b459-bd44650c58c3\") " pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.311657 ip-10-0-130-83 
kubenswrapper[2580]: I0416 19:19:13.311638 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c706952f-9a60-44db-b459-bd44650c58c3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6sxn2\" (UID: \"c706952f-9a60-44db-b459-bd44650c58c3\") " pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.313081 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.313061 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c706952f-9a60-44db-b459-bd44650c58c3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6sxn2\" (UID: \"c706952f-9a60-44db-b459-bd44650c58c3\") " pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.320729 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.320706 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj4p2\" (UniqueName: \"kubernetes.io/projected/c706952f-9a60-44db-b459-bd44650c58c3-kube-api-access-sj4p2\") pod \"insights-runtime-extractor-6sxn2\" (UID: \"c706952f-9a60-44db-b459-bd44650c58c3\") " pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.411525 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.411430 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c9e5c2e8-1a16-46a7-8cdf-f98fc27ce9a6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-f54jm\" (UID: \"c9e5c2e8-1a16-46a7-8cdf-f98fc27ce9a6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f54jm" Apr 16 19:19:13.411525 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.411464 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3dcb10b2-5efe-4644-9b58-27c46c722d34-ca\") pod 
\"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.411525 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.411483 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3dcb10b2-5efe-4644-9b58-27c46c722d34-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.411807 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.411562 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-878pc\" (UniqueName: \"kubernetes.io/projected/3dcb10b2-5efe-4644-9b58-27c46c722d34-kube-api-access-878pc\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.411807 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.411641 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3177a1ae-d609-460a-b4b1-f6a41125fd25-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-75dcbb4775-msk2s\" (UID: \"3177a1ae-d609-460a-b4b1-f6a41125fd25\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75dcbb4775-msk2s" Apr 16 19:19:13.411807 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.411671 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3dcb10b2-5efe-4644-9b58-27c46c722d34-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.411807 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.411694 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3dcb10b2-5efe-4644-9b58-27c46c722d34-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.411807 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.411757 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3dcb10b2-5efe-4644-9b58-27c46c722d34-hub\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.411807 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.411785 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qh6fx\" (UniqueName: \"kubernetes.io/projected/3177a1ae-d609-460a-b4b1-f6a41125fd25-kube-api-access-qh6fx\") pod \"managed-serviceaccount-addon-agent-75dcbb4775-msk2s\" (UID: \"3177a1ae-d609-460a-b4b1-f6a41125fd25\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75dcbb4775-msk2s" Apr 16 19:19:13.412238 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.412212 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3dcb10b2-5efe-4644-9b58-27c46c722d34-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.414321 ip-10-0-130-83 
kubenswrapper[2580]: I0416 19:19:13.414294 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3dcb10b2-5efe-4644-9b58-27c46c722d34-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.414425 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.414381 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3dcb10b2-5efe-4644-9b58-27c46c722d34-ca\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.414571 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.414551 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3dcb10b2-5efe-4644-9b58-27c46c722d34-hub\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.414721 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.414700 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c9e5c2e8-1a16-46a7-8cdf-f98fc27ce9a6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-f54jm\" (UID: \"c9e5c2e8-1a16-46a7-8cdf-f98fc27ce9a6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f54jm" Apr 16 19:19:13.423675 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.423649 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-878pc\" (UniqueName: 
\"kubernetes.io/projected/3dcb10b2-5efe-4644-9b58-27c46c722d34-kube-api-access-878pc\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.426246 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.426230 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh6fx\" (UniqueName: \"kubernetes.io/projected/3177a1ae-d609-460a-b4b1-f6a41125fd25-kube-api-access-qh6fx\") pod \"managed-serviceaccount-addon-agent-75dcbb4775-msk2s\" (UID: \"3177a1ae-d609-460a-b4b1-f6a41125fd25\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75dcbb4775-msk2s" Apr 16 19:19:13.426307 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.426231 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3177a1ae-d609-460a-b4b1-f6a41125fd25-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-75dcbb4775-msk2s\" (UID: \"3177a1ae-d609-460a-b4b1-f6a41125fd25\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75dcbb4775-msk2s" Apr 16 19:19:13.428866 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.428846 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3dcb10b2-5efe-4644-9b58-27c46c722d34-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6b9749fc78-xrpcs\" (UID: \"3dcb10b2-5efe-4644-9b58-27c46c722d34\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.477790 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.477756 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6sxn2" Apr 16 19:19:13.562571 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.562529 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f54jm" Apr 16 19:19:13.591411 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.591379 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" Apr 16 19:19:13.600240 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.600216 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75dcbb4775-msk2s" Apr 16 19:19:13.603837 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.603791 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6sxn2"] Apr 16 19:19:13.618425 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:19:13.618394 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc706952f_9a60_44db_b459_bd44650c58c3.slice/crio-9ba99c5682cb77cb1506553f8a775a2b0b66dc785252bf74f982c5e62efa9d24 WatchSource:0}: Error finding container 9ba99c5682cb77cb1506553f8a775a2b0b66dc785252bf74f982c5e62efa9d24: Status 404 returned error can't find the container with id 9ba99c5682cb77cb1506553f8a775a2b0b66dc785252bf74f982c5e62efa9d24 Apr 16 19:19:13.707600 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.706970 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f54jm"] Apr 16 19:19:13.709484 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:19:13.709415 2580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9e5c2e8_1a16_46a7_8cdf_f98fc27ce9a6.slice/crio-e56207a9cea0b6026e06f976721c26eb91642e68f27a204a8862116e784c57a7 WatchSource:0}: Error finding container e56207a9cea0b6026e06f976721c26eb91642e68f27a204a8862116e784c57a7: Status 404 returned error can't find the container with id e56207a9cea0b6026e06f976721c26eb91642e68f27a204a8862116e784c57a7 Apr 16 19:19:13.736952 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.736924 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs"] Apr 16 19:19:13.740358 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:19:13.740323 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dcb10b2_5efe_4644_9b58_27c46c722d34.slice/crio-fb9c614ef1e8a990f702f2e1f5f5887ce75c4a1765c9d94ea15da3f9782f753c WatchSource:0}: Error finding container fb9c614ef1e8a990f702f2e1f5f5887ce75c4a1765c9d94ea15da3f9782f753c: Status 404 returned error can't find the container with id fb9c614ef1e8a990f702f2e1f5f5887ce75c4a1765c9d94ea15da3f9782f753c Apr 16 19:19:13.750091 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:13.750060 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75dcbb4775-msk2s"] Apr 16 19:19:13.759732 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:19:13.759705 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3177a1ae_d609_460a_b4b1_f6a41125fd25.slice/crio-b7d202edc3e1349d7def9e072183e04b4472e50a3c188dbcdfa34e5d92ed34ab WatchSource:0}: Error finding container b7d202edc3e1349d7def9e072183e04b4472e50a3c188dbcdfa34e5d92ed34ab: Status 404 returned error can't find the container with id b7d202edc3e1349d7def9e072183e04b4472e50a3c188dbcdfa34e5d92ed34ab Apr 16 
19:19:14.328864 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:14.328831 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2pfdm" Apr 16 19:19:14.374167 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:14.374107 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75dcbb4775-msk2s" event={"ID":"3177a1ae-d609-460a-b4b1-f6a41125fd25","Type":"ContainerStarted","Data":"b7d202edc3e1349d7def9e072183e04b4472e50a3c188dbcdfa34e5d92ed34ab"} Apr 16 19:19:14.380871 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:14.380771 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6sxn2" event={"ID":"c706952f-9a60-44db-b459-bd44650c58c3","Type":"ContainerStarted","Data":"4abb112963bc532053c5eaddfb48d9b97bfa02579b5554af3de0341512b00730"} Apr 16 19:19:14.380871 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:14.380814 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6sxn2" event={"ID":"c706952f-9a60-44db-b459-bd44650c58c3","Type":"ContainerStarted","Data":"9ba99c5682cb77cb1506553f8a775a2b0b66dc785252bf74f982c5e62efa9d24"} Apr 16 19:19:14.383547 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:14.383515 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f54jm" event={"ID":"c9e5c2e8-1a16-46a7-8cdf-f98fc27ce9a6","Type":"ContainerStarted","Data":"e56207a9cea0b6026e06f976721c26eb91642e68f27a204a8862116e784c57a7"} Apr 16 19:19:14.385937 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:14.385872 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" 
event={"ID":"3dcb10b2-5efe-4644-9b58-27c46c722d34","Type":"ContainerStarted","Data":"fb9c614ef1e8a990f702f2e1f5f5887ce75c4a1765c9d94ea15da3f9782f753c"} Apr 16 19:19:17.397077 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:17.396975 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" event={"ID":"3dcb10b2-5efe-4644-9b58-27c46c722d34","Type":"ContainerStarted","Data":"f32e4359dae75d9cd7a27cd4ea4394a8477cd045cfdc8d6c23cfa9a99bb6a8cd"} Apr 16 19:19:17.398513 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:17.398464 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75dcbb4775-msk2s" event={"ID":"3177a1ae-d609-460a-b4b1-f6a41125fd25","Type":"ContainerStarted","Data":"c2a2f78f5b0ad723ebe6383225b4679b66ceba73244e3a36b2b982f37966719d"} Apr 16 19:19:17.400375 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:17.400338 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6sxn2" event={"ID":"c706952f-9a60-44db-b459-bd44650c58c3","Type":"ContainerStarted","Data":"70b5f9bc18ecf0969346668a589bb86ca7b64b93e0c84fb3469a1450a5107a87"} Apr 16 19:19:17.401697 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:17.401674 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f54jm" event={"ID":"c9e5c2e8-1a16-46a7-8cdf-f98fc27ce9a6","Type":"ContainerStarted","Data":"af39d58eedcb2f8a8cab2f501a34e05de87bfed5ab9f4c3e9eca8c83532e8c35"} Apr 16 19:19:17.401914 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:17.401888 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f54jm" Apr 16 19:19:17.407513 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:17.407491 2580 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f54jm" Apr 16 19:19:17.414206 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:17.414153 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75dcbb4775-msk2s" podStartSLOduration=1.027500325 podStartE2EDuration="4.414138802s" podCreationTimestamp="2026-04-16 19:19:13 +0000 UTC" firstStartedPulling="2026-04-16 19:19:13.76160686 +0000 UTC m=+61.248804670" lastFinishedPulling="2026-04-16 19:19:17.148245333 +0000 UTC m=+64.635443147" observedRunningTime="2026-04-16 19:19:17.413518087 +0000 UTC m=+64.900715911" watchObservedRunningTime="2026-04-16 19:19:17.414138802 +0000 UTC m=+64.901336635" Apr 16 19:19:17.428104 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:17.428060 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-f54jm" podStartSLOduration=0.999716781 podStartE2EDuration="4.428044112s" podCreationTimestamp="2026-04-16 19:19:13 +0000 UTC" firstStartedPulling="2026-04-16 19:19:13.71192857 +0000 UTC m=+61.199126381" lastFinishedPulling="2026-04-16 19:19:17.140255899 +0000 UTC m=+64.627453712" observedRunningTime="2026-04-16 19:19:17.427421953 +0000 UTC m=+64.914619785" watchObservedRunningTime="2026-04-16 19:19:17.428044112 +0000 UTC m=+64.915241922" Apr 16 19:19:18.406834 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:18.406739 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6sxn2" event={"ID":"c706952f-9a60-44db-b459-bd44650c58c3","Type":"ContainerStarted","Data":"073dc5784bcd6bf5877229baf02a2326c7dea2f85ba8d3e3a44c2cc9155a3b96"} Apr 16 19:19:18.425905 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:18.425852 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-insights/insights-runtime-extractor-6sxn2" podStartSLOduration=1.106872974 podStartE2EDuration="5.425832844s" podCreationTimestamp="2026-04-16 19:19:13 +0000 UTC" firstStartedPulling="2026-04-16 19:19:13.773035515 +0000 UTC m=+61.260233325" lastFinishedPulling="2026-04-16 19:19:18.091995385 +0000 UTC m=+65.579193195" observedRunningTime="2026-04-16 19:19:18.42516985 +0000 UTC m=+65.912367681" watchObservedRunningTime="2026-04-16 19:19:18.425832844 +0000 UTC m=+65.913030677" Apr 16 19:19:18.748447 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:18.748412 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs\") pod \"network-metrics-daemon-hj7p9\" (UID: \"c79c97f4-34fe-4b2b-9f22-401688c77d79\") " pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:19:18.751380 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:18.751353 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:19:18.762070 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:18.762036 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c79c97f4-34fe-4b2b-9f22-401688c77d79-metrics-certs\") pod \"network-metrics-daemon-hj7p9\" (UID: \"c79c97f4-34fe-4b2b-9f22-401688c77d79\") " pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:19:18.896982 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:18.896946 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t5lg8\"" Apr 16 19:19:18.904934 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:18.904898 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hj7p9" Apr 16 19:19:19.034734 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:19.034655 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hj7p9"] Apr 16 19:19:19.479752 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:19:19.479717 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc79c97f4_34fe_4b2b_9f22_401688c77d79.slice/crio-55bdc3fdd85d927892b778b0d5a75136f33f59268b996a6ef1e1e79b4dee4807 WatchSource:0}: Error finding container 55bdc3fdd85d927892b778b0d5a75136f33f59268b996a6ef1e1e79b4dee4807: Status 404 returned error can't find the container with id 55bdc3fdd85d927892b778b0d5a75136f33f59268b996a6ef1e1e79b4dee4807 Apr 16 19:19:20.415780 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:20.415736 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" event={"ID":"3dcb10b2-5efe-4644-9b58-27c46c722d34","Type":"ContainerStarted","Data":"faf217b79e7b211e2f033d82074a169f320eb73b0e2a976026047dbf8edecfbc"} Apr 16 19:19:20.415780 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:20.415785 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" event={"ID":"3dcb10b2-5efe-4644-9b58-27c46c722d34","Type":"ContainerStarted","Data":"38b7cc93fa50681a8199c58ccb53bf5dce3755d9aaf2701df18eac2686e69871"} Apr 16 19:19:20.416886 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:20.416858 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hj7p9" event={"ID":"c79c97f4-34fe-4b2b-9f22-401688c77d79","Type":"ContainerStarted","Data":"55bdc3fdd85d927892b778b0d5a75136f33f59268b996a6ef1e1e79b4dee4807"} Apr 16 19:19:20.439203 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:20.439125 2580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b9749fc78-xrpcs" podStartSLOduration=1.6454847209999999 podStartE2EDuration="7.439106876s" podCreationTimestamp="2026-04-16 19:19:13 +0000 UTC" firstStartedPulling="2026-04-16 19:19:13.742059117 +0000 UTC m=+61.229256928" lastFinishedPulling="2026-04-16 19:19:19.535681263 +0000 UTC m=+67.022879083" observedRunningTime="2026-04-16 19:19:20.438081485 +0000 UTC m=+67.925279316" watchObservedRunningTime="2026-04-16 19:19:20.439106876 +0000 UTC m=+67.926304712" Apr 16 19:19:21.421347 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:21.421315 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hj7p9" event={"ID":"c79c97f4-34fe-4b2b-9f22-401688c77d79","Type":"ContainerStarted","Data":"7e0c9c9700cc6cd51155c38d10a4fb6cc8fb7beb073471d8456a1023fcaf8cf5"} Apr 16 19:19:21.421347 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:21.421351 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hj7p9" event={"ID":"c79c97f4-34fe-4b2b-9f22-401688c77d79","Type":"ContainerStarted","Data":"e5cccf8cefd7303f0f27ec24b7c582cf76a5eeffc577d601a7f500423b426baa"} Apr 16 19:19:21.440841 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:21.440782 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hj7p9" podStartSLOduration=67.34722169 podStartE2EDuration="1m8.440763024s" podCreationTimestamp="2026-04-16 19:18:13 +0000 UTC" firstStartedPulling="2026-04-16 19:19:19.481587426 +0000 UTC m=+66.968785250" lastFinishedPulling="2026-04-16 19:19:20.575128771 +0000 UTC m=+68.062326584" observedRunningTime="2026-04-16 19:19:21.440619245 +0000 UTC m=+68.927817078" watchObservedRunningTime="2026-04-16 19:19:21.440763024 +0000 UTC m=+68.927960859" Apr 16 19:19:24.283031 ip-10-0-130-83 kubenswrapper[2580]: I0416 
19:19:24.282900 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mql76"] Apr 16 19:19:24.286367 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.286347 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.291420 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.291081 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 19:19:24.291420 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.291137 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 19:19:24.291420 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.291164 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-jt2bw\"" Apr 16 19:19:24.291420 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.291273 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 19:19:24.291420 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.291312 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 19:19:24.291719 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.291534 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 19:19:24.291719 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.291568 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 19:19:24.296856 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.296838 2580 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zsg8r" Apr 16 19:19:24.304567 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.304546 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mql76"] Apr 16 19:19:24.314832 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.314811 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-mhtzt"] Apr 16 19:19:24.318513 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.318492 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.321276 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.321243 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-9z59t\"" Apr 16 19:19:24.322011 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.321771 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 19:19:24.322011 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.321852 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 19:19:24.322168 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.322091 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 19:19:24.386317 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.386280 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e39342d6-d92b-4524-a119-2c56664bbc27-root\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " 
pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.386503 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.386326 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e39342d6-d92b-4524-a119-2c56664bbc27-node-exporter-tls\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.386503 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.386355 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f792fa48-e0c2-4512-890b-752dbfc3ceaf-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.386503 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.386466 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e39342d6-d92b-4524-a119-2c56664bbc27-node-exporter-textfile\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.386662 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.386510 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrrfz\" (UniqueName: \"kubernetes.io/projected/f792fa48-e0c2-4512-890b-752dbfc3ceaf-kube-api-access-nrrfz\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.386662 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.386567 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e39342d6-d92b-4524-a119-2c56664bbc27-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.386662 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.386615 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f792fa48-e0c2-4512-890b-752dbfc3ceaf-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.386662 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.386649 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e39342d6-d92b-4524-a119-2c56664bbc27-metrics-client-ca\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.386848 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.386703 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d964b\" (UniqueName: \"kubernetes.io/projected/e39342d6-d92b-4524-a119-2c56664bbc27-kube-api-access-d964b\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.386848 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.386735 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/e39342d6-d92b-4524-a119-2c56664bbc27-node-exporter-accelerators-collector-config\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.386848 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.386785 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f792fa48-e0c2-4512-890b-752dbfc3ceaf-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.386848 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.386814 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e39342d6-d92b-4524-a119-2c56664bbc27-sys\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.387034 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.386856 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f792fa48-e0c2-4512-890b-752dbfc3ceaf-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.387034 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.386885 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e39342d6-d92b-4524-a119-2c56664bbc27-node-exporter-wtmp\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 
19:19:24.387034 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.386926 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f792fa48-e0c2-4512-890b-752dbfc3ceaf-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.487718 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.487678 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e39342d6-d92b-4524-a119-2c56664bbc27-node-exporter-tls\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.487905 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.487725 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f792fa48-e0c2-4512-890b-752dbfc3ceaf-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.487905 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.487778 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e39342d6-d92b-4524-a119-2c56664bbc27-node-exporter-textfile\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.487905 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.487814 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrrfz\" (UniqueName: 
\"kubernetes.io/projected/f792fa48-e0c2-4512-890b-752dbfc3ceaf-kube-api-access-nrrfz\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.487905 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.487851 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e39342d6-d92b-4524-a119-2c56664bbc27-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.487905 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.487878 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f792fa48-e0c2-4512-890b-752dbfc3ceaf-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.488167 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.487918 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e39342d6-d92b-4524-a119-2c56664bbc27-metrics-client-ca\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.488167 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.487952 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d964b\" (UniqueName: \"kubernetes.io/projected/e39342d6-d92b-4524-a119-2c56664bbc27-kube-api-access-d964b\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " 
pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.488167 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.487978 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e39342d6-d92b-4524-a119-2c56664bbc27-node-exporter-accelerators-collector-config\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.488167 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.488018 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f792fa48-e0c2-4512-890b-752dbfc3ceaf-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.488167 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.488045 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e39342d6-d92b-4524-a119-2c56664bbc27-sys\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.488167 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.488073 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f792fa48-e0c2-4512-890b-752dbfc3ceaf-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.488167 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.488099 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/e39342d6-d92b-4524-a119-2c56664bbc27-node-exporter-wtmp\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.488167 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.488123 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f792fa48-e0c2-4512-890b-752dbfc3ceaf-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.488579 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.488178 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e39342d6-d92b-4524-a119-2c56664bbc27-root\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.488579 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.488214 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e39342d6-d92b-4524-a119-2c56664bbc27-node-exporter-textfile\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.488579 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.488282 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e39342d6-d92b-4524-a119-2c56664bbc27-root\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.488579 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.488340 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/e39342d6-d92b-4524-a119-2c56664bbc27-sys\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.488579 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.488535 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e39342d6-d92b-4524-a119-2c56664bbc27-metrics-client-ca\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.488579 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:19:24.488562 2580 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 19:19:24.488878 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:19:24.488618 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f792fa48-e0c2-4512-890b-752dbfc3ceaf-kube-state-metrics-tls podName:f792fa48-e0c2-4512-890b-752dbfc3ceaf nodeName:}" failed. No retries permitted until 2026-04-16 19:19:24.988599573 +0000 UTC m=+72.475797398 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f792fa48-e0c2-4512-890b-752dbfc3ceaf-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-mql76" (UID: "f792fa48-e0c2-4512-890b-752dbfc3ceaf") : secret "kube-state-metrics-tls" not found Apr 16 19:19:24.488878 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.488640 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e39342d6-d92b-4524-a119-2c56664bbc27-node-exporter-wtmp\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.489034 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.489014 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e39342d6-d92b-4524-a119-2c56664bbc27-node-exporter-accelerators-collector-config\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.489270 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.489243 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f792fa48-e0c2-4512-890b-752dbfc3ceaf-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.489512 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.489490 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f792fa48-e0c2-4512-890b-752dbfc3ceaf-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: 
\"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.489919 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.489896 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f792fa48-e0c2-4512-890b-752dbfc3ceaf-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.490405 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.490388 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e39342d6-d92b-4524-a119-2c56664bbc27-node-exporter-tls\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.490750 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.490733 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e39342d6-d92b-4524-a119-2c56664bbc27-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.491317 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.491295 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f792fa48-e0c2-4512-890b-752dbfc3ceaf-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.505143 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.501122 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-d964b\" (UniqueName: \"kubernetes.io/projected/e39342d6-d92b-4524-a119-2c56664bbc27-kube-api-access-d964b\") pod \"node-exporter-mhtzt\" (UID: \"e39342d6-d92b-4524-a119-2c56664bbc27\") " pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.505143 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.501492 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrrfz\" (UniqueName: \"kubernetes.io/projected/f792fa48-e0c2-4512-890b-752dbfc3ceaf-kube-api-access-nrrfz\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.629144 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.629059 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mhtzt" Apr 16 19:19:24.637128 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:19:24.637092 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode39342d6_d92b_4524_a119_2c56664bbc27.slice/crio-26344584c9f58cfe51c2d3ce98561f19039e96bc94c00aa085a612e1d4419f0d WatchSource:0}: Error finding container 26344584c9f58cfe51c2d3ce98561f19039e96bc94c00aa085a612e1d4419f0d: Status 404 returned error can't find the container with id 26344584c9f58cfe51c2d3ce98561f19039e96bc94c00aa085a612e1d4419f0d Apr 16 19:19:24.991746 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.991707 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f792fa48-e0c2-4512-890b-752dbfc3ceaf-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:24.994006 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:24.993987 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f792fa48-e0c2-4512-890b-752dbfc3ceaf-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mql76\" (UID: \"f792fa48-e0c2-4512-890b-752dbfc3ceaf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:25.198575 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:25.198538 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" Apr 16 19:19:25.339507 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:25.339484 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mql76"] Apr 16 19:19:25.342104 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:19:25.342071 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf792fa48_e0c2_4512_890b_752dbfc3ceaf.slice/crio-126f3c8bd39470772672f143e1a010296a855ebdc91b0e31af9ba6e07808a672 WatchSource:0}: Error finding container 126f3c8bd39470772672f143e1a010296a855ebdc91b0e31af9ba6e07808a672: Status 404 returned error can't find the container with id 126f3c8bd39470772672f143e1a010296a855ebdc91b0e31af9ba6e07808a672 Apr 16 19:19:25.435428 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:25.435388 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mhtzt" event={"ID":"e39342d6-d92b-4524-a119-2c56664bbc27","Type":"ContainerStarted","Data":"26344584c9f58cfe51c2d3ce98561f19039e96bc94c00aa085a612e1d4419f0d"} Apr 16 19:19:25.436603 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:25.436573 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" 
event={"ID":"f792fa48-e0c2-4512-890b-752dbfc3ceaf","Type":"ContainerStarted","Data":"126f3c8bd39470772672f143e1a010296a855ebdc91b0e31af9ba6e07808a672"} Apr 16 19:19:26.440671 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:26.440630 2580 generic.go:358] "Generic (PLEG): container finished" podID="e39342d6-d92b-4524-a119-2c56664bbc27" containerID="ce056bb305f125ece4bda49078334a267bc9397e4a563e57148474b16205c460" exitCode=0 Apr 16 19:19:26.441062 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:26.440686 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mhtzt" event={"ID":"e39342d6-d92b-4524-a119-2c56664bbc27","Type":"ContainerDied","Data":"ce056bb305f125ece4bda49078334a267bc9397e4a563e57148474b16205c460"} Apr 16 19:19:27.445994 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:27.445946 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mhtzt" event={"ID":"e39342d6-d92b-4524-a119-2c56664bbc27","Type":"ContainerStarted","Data":"0970adeebf43565e65e91a9c4f245e47aa2a726f7711b36cb0dc5b3dc50b189e"} Apr 16 19:19:27.445994 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:27.445999 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mhtzt" event={"ID":"e39342d6-d92b-4524-a119-2c56664bbc27","Type":"ContainerStarted","Data":"9b79fe2f529ea5cfac58ba9ecb9cb3d85a9389b5fa1cf95cba47482e90a9c235"} Apr 16 19:19:27.448008 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:27.447976 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" event={"ID":"f792fa48-e0c2-4512-890b-752dbfc3ceaf","Type":"ContainerStarted","Data":"c331210b2b390ed1c87e09c9f2f291588c65dcbbe11a1413143613fb7894b7e4"} Apr 16 19:19:27.448144 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:27.448013 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" 
event={"ID":"f792fa48-e0c2-4512-890b-752dbfc3ceaf","Type":"ContainerStarted","Data":"1de75e7fa65a3308d10cd2808623f0b14529db67bdd9bdb83eb63f2d7a184f37"} Apr 16 19:19:27.448144 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:27.448028 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" event={"ID":"f792fa48-e0c2-4512-890b-752dbfc3ceaf","Type":"ContainerStarted","Data":"a1a4d6dc124e1926a23238888fdd1c9c22bf32e653bbd23e575a9edfa71cc4b4"} Apr 16 19:19:27.467077 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:27.467017 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-mhtzt" podStartSLOduration=2.561875784 podStartE2EDuration="3.466998318s" podCreationTimestamp="2026-04-16 19:19:24 +0000 UTC" firstStartedPulling="2026-04-16 19:19:24.639129004 +0000 UTC m=+72.126326831" lastFinishedPulling="2026-04-16 19:19:25.544251555 +0000 UTC m=+73.031449365" observedRunningTime="2026-04-16 19:19:27.46662322 +0000 UTC m=+74.953821053" watchObservedRunningTime="2026-04-16 19:19:27.466998318 +0000 UTC m=+74.954196148" Apr 16 19:19:27.484668 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:27.484604 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-mql76" podStartSLOduration=1.860444 podStartE2EDuration="3.484583803s" podCreationTimestamp="2026-04-16 19:19:24 +0000 UTC" firstStartedPulling="2026-04-16 19:19:25.343897669 +0000 UTC m=+72.831095483" lastFinishedPulling="2026-04-16 19:19:26.968037473 +0000 UTC m=+74.455235286" observedRunningTime="2026-04-16 19:19:27.482819447 +0000 UTC m=+74.970017279" watchObservedRunningTime="2026-04-16 19:19:27.484583803 +0000 UTC m=+74.971781636" Apr 16 19:19:29.028244 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:29.028204 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-bv259"] Apr 16 
19:19:29.030995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:29.030978 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bv259" Apr 16 19:19:29.033487 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:29.033465 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-p97dl\"" Apr 16 19:19:29.033841 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:29.033828 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 19:19:29.040783 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:29.040763 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-bv259"] Apr 16 19:19:29.118875 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:29.118845 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cd597be-9435-451a-9f7c-11d341f570a5-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-bv259\" (UID: \"5cd597be-9435-451a-9f7c-11d341f570a5\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bv259" Apr 16 19:19:29.219577 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:29.219543 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5cd597be-9435-451a-9f7c-11d341f570a5-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-bv259\" (UID: \"5cd597be-9435-451a-9f7c-11d341f570a5\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bv259" Apr 16 19:19:29.221956 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:29.221927 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5cd597be-9435-451a-9f7c-11d341f570a5-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-bv259\" (UID: \"5cd597be-9435-451a-9f7c-11d341f570a5\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bv259" Apr 16 19:19:29.342371 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:29.342286 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bv259" Apr 16 19:19:29.468810 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:29.468777 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-bv259"] Apr 16 19:19:29.473278 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:19:29.473251 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cd597be_9435_451a_9f7c_11d341f570a5.slice/crio-5ff78aaee3e2b7c28e34e19d1cc8faf54f6eb28ae9bf49a39bc27e4c52093d82 WatchSource:0}: Error finding container 5ff78aaee3e2b7c28e34e19d1cc8faf54f6eb28ae9bf49a39bc27e4c52093d82: Status 404 returned error can't find the container with id 5ff78aaee3e2b7c28e34e19d1cc8faf54f6eb28ae9bf49a39bc27e4c52093d82 Apr 16 19:19:30.458806 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:30.458768 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bv259" event={"ID":"5cd597be-9435-451a-9f7c-11d341f570a5","Type":"ContainerStarted","Data":"5ff78aaee3e2b7c28e34e19d1cc8faf54f6eb28ae9bf49a39bc27e4c52093d82"} Apr 16 19:19:31.463045 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:31.463003 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bv259" event={"ID":"5cd597be-9435-451a-9f7c-11d341f570a5","Type":"ContainerStarted","Data":"38ab9544cda173f9fe0484fbe9e32b8ab800822aee8176fb042aa68b0fe7ab1b"} Apr 16 19:19:31.463453 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:31.463248 
2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bv259" Apr 16 19:19:31.467660 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:31.467637 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bv259" Apr 16 19:19:31.482592 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:31.482541 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bv259" podStartSLOduration=1.117198645 podStartE2EDuration="2.482528162s" podCreationTimestamp="2026-04-16 19:19:29 +0000 UTC" firstStartedPulling="2026-04-16 19:19:29.475201063 +0000 UTC m=+76.962398886" lastFinishedPulling="2026-04-16 19:19:30.840530594 +0000 UTC m=+78.327728403" observedRunningTime="2026-04-16 19:19:31.481842259 +0000 UTC m=+78.969040089" watchObservedRunningTime="2026-04-16 19:19:31.482528162 +0000 UTC m=+78.969725998" Apr 16 19:19:32.952154 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:32.952117 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-5lqq4"] Apr 16 19:19:32.956362 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:32.956339 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-5lqq4" Apr 16 19:19:32.958802 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:32.958774 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 19:19:32.958925 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:32.958809 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-5m89g\"" Apr 16 19:19:32.958925 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:32.958858 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 19:19:32.964848 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:32.964828 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-5lqq4"] Apr 16 19:19:33.050505 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:33.050470 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhvxw\" (UniqueName: \"kubernetes.io/projected/ce85330e-a89a-4e63-b5d1-7af0e5319b88-kube-api-access-zhvxw\") pod \"downloads-6bcc868b7-5lqq4\" (UID: \"ce85330e-a89a-4e63-b5d1-7af0e5319b88\") " pod="openshift-console/downloads-6bcc868b7-5lqq4" Apr 16 19:19:33.151080 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:33.151050 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhvxw\" (UniqueName: \"kubernetes.io/projected/ce85330e-a89a-4e63-b5d1-7af0e5319b88-kube-api-access-zhvxw\") pod \"downloads-6bcc868b7-5lqq4\" (UID: \"ce85330e-a89a-4e63-b5d1-7af0e5319b88\") " pod="openshift-console/downloads-6bcc868b7-5lqq4" Apr 16 19:19:33.160918 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:33.160894 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhvxw\" (UniqueName: 
\"kubernetes.io/projected/ce85330e-a89a-4e63-b5d1-7af0e5319b88-kube-api-access-zhvxw\") pod \"downloads-6bcc868b7-5lqq4\" (UID: \"ce85330e-a89a-4e63-b5d1-7af0e5319b88\") " pod="openshift-console/downloads-6bcc868b7-5lqq4" Apr 16 19:19:33.266735 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:33.266636 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-5lqq4" Apr 16 19:19:33.388875 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:33.388835 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-5lqq4"] Apr 16 19:19:33.391764 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:19:33.391726 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce85330e_a89a_4e63_b5d1_7af0e5319b88.slice/crio-57034a1f99e9ca8d5181a10031f674c615ea73b24b6f5d9c8af33805f5057057 WatchSource:0}: Error finding container 57034a1f99e9ca8d5181a10031f674c615ea73b24b6f5d9c8af33805f5057057: Status 404 returned error can't find the container with id 57034a1f99e9ca8d5181a10031f674c615ea73b24b6f5d9c8af33805f5057057 Apr 16 19:19:33.469570 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:33.469533 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-5lqq4" event={"ID":"ce85330e-a89a-4e63-b5d1-7af0e5319b88","Type":"ContainerStarted","Data":"57034a1f99e9ca8d5181a10031f674c615ea73b24b6f5d9c8af33805f5057057"} Apr 16 19:19:49.518051 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:49.518018 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-5lqq4" event={"ID":"ce85330e-a89a-4e63-b5d1-7af0e5319b88","Type":"ContainerStarted","Data":"b212098aaee2838274a7f9a9943fa9a7ef71fa869fab340521d76ef3c8915eb7"} Apr 16 19:19:49.518533 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:49.518222 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-console/downloads-6bcc868b7-5lqq4" Apr 16 19:19:49.519604 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:49.519579 2580 patch_prober.go:28] interesting pod/downloads-6bcc868b7-5lqq4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.14:8080/\": dial tcp 10.134.0.14:8080: connect: connection refused" start-of-body= Apr 16 19:19:49.519712 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:49.519627 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-5lqq4" podUID="ce85330e-a89a-4e63-b5d1-7af0e5319b88" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.14:8080/\": dial tcp 10.134.0.14:8080: connect: connection refused" Apr 16 19:19:50.536431 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:50.536392 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-5lqq4" Apr 16 19:19:50.560058 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:19:50.560002 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-5lqq4" podStartSLOduration=2.673869616 podStartE2EDuration="18.559983663s" podCreationTimestamp="2026-04-16 19:19:32 +0000 UTC" firstStartedPulling="2026-04-16 19:19:33.393658636 +0000 UTC m=+80.880856461" lastFinishedPulling="2026-04-16 19:19:49.279772697 +0000 UTC m=+96.766970508" observedRunningTime="2026-04-16 19:19:49.535714383 +0000 UTC m=+97.022912216" watchObservedRunningTime="2026-04-16 19:19:50.559983663 +0000 UTC m=+98.047181496" Apr 16 19:20:22.281109 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:20:22.281075 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mql76_f792fa48-e0c2-4512-890b-752dbfc3ceaf/kube-state-metrics/0.log" Apr 16 19:20:22.469399 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:20:22.469363 2580 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mql76_f792fa48-e0c2-4512-890b-752dbfc3ceaf/kube-rbac-proxy-main/0.log" Apr 16 19:20:22.668942 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:20:22.668861 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mql76_f792fa48-e0c2-4512-890b-752dbfc3ceaf/kube-rbac-proxy-self/0.log" Apr 16 19:20:23.069218 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:20:23.069167 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-bv259_5cd597be-9435-451a-9f7c-11d341f570a5/monitoring-plugin/0.log" Apr 16 19:20:23.269100 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:20:23.269073 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mhtzt_e39342d6-d92b-4524-a119-2c56664bbc27/init-textfile/0.log" Apr 16 19:20:23.477484 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:20:23.477458 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mhtzt_e39342d6-d92b-4524-a119-2c56664bbc27/node-exporter/0.log" Apr 16 19:20:23.668725 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:20:23.668693 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mhtzt_e39342d6-d92b-4524-a119-2c56664bbc27/kube-rbac-proxy/0.log" Apr 16 19:20:27.468814 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:20:27.468781 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-f54jm_c9e5c2e8-1a16-46a7-8cdf-f98fc27ce9a6/prometheus-operator-admission-webhook/0.log" Apr 16 19:20:30.471752 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:20:30.471714 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-5lqq4_ce85330e-a89a-4e63-b5d1-7af0e5319b88/download-server/0.log" 
Apr 16 19:20:30.869405 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:20:30.869328 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6pw99_512ebef3-162f-4664-8e29-302b9cbd3861/serve-healthcheck-canary/0.log" Apr 16 19:23:12.976579 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:23:12.976546 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:23:12.977075 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:23:12.976594 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:23:12.986238 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:23:12.986217 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 19:26:11.677448 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:11.677416 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qwd72"] Apr 16 19:26:11.680399 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:11.680382 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qwd72" Apr 16 19:26:11.684064 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:11.684039 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 19:26:11.699491 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:11.699458 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qwd72"] Apr 16 19:26:11.808970 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:11.808928 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a0ecccc6-8176-439c-ac99-888ce68e7bf3-kubelet-config\") pod \"global-pull-secret-syncer-qwd72\" (UID: \"a0ecccc6-8176-439c-ac99-888ce68e7bf3\") " pod="kube-system/global-pull-secret-syncer-qwd72" Apr 16 19:26:11.808970 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:11.808973 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a0ecccc6-8176-439c-ac99-888ce68e7bf3-original-pull-secret\") pod \"global-pull-secret-syncer-qwd72\" (UID: \"a0ecccc6-8176-439c-ac99-888ce68e7bf3\") " pod="kube-system/global-pull-secret-syncer-qwd72" Apr 16 19:26:11.809226 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:11.809069 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a0ecccc6-8176-439c-ac99-888ce68e7bf3-dbus\") pod \"global-pull-secret-syncer-qwd72\" (UID: \"a0ecccc6-8176-439c-ac99-888ce68e7bf3\") " pod="kube-system/global-pull-secret-syncer-qwd72" Apr 16 19:26:11.910149 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:11.910112 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/a0ecccc6-8176-439c-ac99-888ce68e7bf3-kubelet-config\") pod \"global-pull-secret-syncer-qwd72\" (UID: \"a0ecccc6-8176-439c-ac99-888ce68e7bf3\") " pod="kube-system/global-pull-secret-syncer-qwd72" Apr 16 19:26:11.910149 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:11.910152 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a0ecccc6-8176-439c-ac99-888ce68e7bf3-original-pull-secret\") pod \"global-pull-secret-syncer-qwd72\" (UID: \"a0ecccc6-8176-439c-ac99-888ce68e7bf3\") " pod="kube-system/global-pull-secret-syncer-qwd72" Apr 16 19:26:11.910342 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:11.910237 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a0ecccc6-8176-439c-ac99-888ce68e7bf3-dbus\") pod \"global-pull-secret-syncer-qwd72\" (UID: \"a0ecccc6-8176-439c-ac99-888ce68e7bf3\") " pod="kube-system/global-pull-secret-syncer-qwd72" Apr 16 19:26:11.910342 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:11.910264 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a0ecccc6-8176-439c-ac99-888ce68e7bf3-kubelet-config\") pod \"global-pull-secret-syncer-qwd72\" (UID: \"a0ecccc6-8176-439c-ac99-888ce68e7bf3\") " pod="kube-system/global-pull-secret-syncer-qwd72" Apr 16 19:26:11.910506 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:11.910481 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a0ecccc6-8176-439c-ac99-888ce68e7bf3-dbus\") pod \"global-pull-secret-syncer-qwd72\" (UID: \"a0ecccc6-8176-439c-ac99-888ce68e7bf3\") " pod="kube-system/global-pull-secret-syncer-qwd72" Apr 16 19:26:11.912508 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:11.912488 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a0ecccc6-8176-439c-ac99-888ce68e7bf3-original-pull-secret\") pod \"global-pull-secret-syncer-qwd72\" (UID: \"a0ecccc6-8176-439c-ac99-888ce68e7bf3\") " pod="kube-system/global-pull-secret-syncer-qwd72" Apr 16 19:26:11.988748 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:11.988711 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qwd72" Apr 16 19:26:12.103250 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:12.103219 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qwd72"] Apr 16 19:26:12.106438 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:26:12.106409 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0ecccc6_8176_439c_ac99_888ce68e7bf3.slice/crio-73cb9c743bc3760dce5b049b2d6eda89790049f23110f9f4d270ef9dc74aee62 WatchSource:0}: Error finding container 73cb9c743bc3760dce5b049b2d6eda89790049f23110f9f4d270ef9dc74aee62: Status 404 returned error can't find the container with id 73cb9c743bc3760dce5b049b2d6eda89790049f23110f9f4d270ef9dc74aee62 Apr 16 19:26:12.107797 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:12.107782 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:26:12.487353 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:12.487313 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qwd72" event={"ID":"a0ecccc6-8176-439c-ac99-888ce68e7bf3","Type":"ContainerStarted","Data":"73cb9c743bc3760dce5b049b2d6eda89790049f23110f9f4d270ef9dc74aee62"} Apr 16 19:26:16.502888 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:16.502853 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qwd72" 
event={"ID":"a0ecccc6-8176-439c-ac99-888ce68e7bf3","Type":"ContainerStarted","Data":"feb4a796e4eda9f5e258db8e69623d80af5781395cfca015e68938dd92d73ba6"} Apr 16 19:26:16.521694 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:26:16.521453 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qwd72" podStartSLOduration=1.313741348 podStartE2EDuration="5.521434456s" podCreationTimestamp="2026-04-16 19:26:11 +0000 UTC" firstStartedPulling="2026-04-16 19:26:12.107907542 +0000 UTC m=+479.595105353" lastFinishedPulling="2026-04-16 19:26:16.315600649 +0000 UTC m=+483.802798461" observedRunningTime="2026-04-16 19:26:16.521209102 +0000 UTC m=+484.008406933" watchObservedRunningTime="2026-04-16 19:26:16.521434456 +0000 UTC m=+484.008632289" Apr 16 19:27:06.718421 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:06.718387 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-7jlnj"] Apr 16 19:27:06.721340 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:06.721325 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-7jlnj" Apr 16 19:27:06.723690 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:06.723669 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 19:27:06.724386 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:06.724367 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-j5kv2\"" Apr 16 19:27:06.724485 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:06.724394 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 19:27:06.730455 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:06.730436 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-7jlnj"] Apr 16 19:27:06.809142 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:06.809110 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5aa088f-6b86-4605-85aa-e4b77454d308-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-7jlnj\" (UID: \"f5aa088f-6b86-4605-85aa-e4b77454d308\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-7jlnj" Apr 16 19:27:06.809323 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:06.809165 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfpks\" (UniqueName: \"kubernetes.io/projected/f5aa088f-6b86-4605-85aa-e4b77454d308-kube-api-access-wfpks\") pod \"cert-manager-cainjector-8966b78d4-7jlnj\" (UID: \"f5aa088f-6b86-4605-85aa-e4b77454d308\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-7jlnj" Apr 16 19:27:06.909490 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:06.909454 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wfpks\" (UniqueName: \"kubernetes.io/projected/f5aa088f-6b86-4605-85aa-e4b77454d308-kube-api-access-wfpks\") pod \"cert-manager-cainjector-8966b78d4-7jlnj\" (UID: \"f5aa088f-6b86-4605-85aa-e4b77454d308\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-7jlnj" Apr 16 19:27:06.909643 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:06.909507 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5aa088f-6b86-4605-85aa-e4b77454d308-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-7jlnj\" (UID: \"f5aa088f-6b86-4605-85aa-e4b77454d308\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-7jlnj" Apr 16 19:27:06.922951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:06.922929 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5aa088f-6b86-4605-85aa-e4b77454d308-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-7jlnj\" (UID: \"f5aa088f-6b86-4605-85aa-e4b77454d308\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-7jlnj" Apr 16 19:27:06.923105 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:06.923087 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfpks\" (UniqueName: \"kubernetes.io/projected/f5aa088f-6b86-4605-85aa-e4b77454d308-kube-api-access-wfpks\") pod \"cert-manager-cainjector-8966b78d4-7jlnj\" (UID: \"f5aa088f-6b86-4605-85aa-e4b77454d308\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-7jlnj" Apr 16 19:27:07.030872 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:07.030791 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-7jlnj" Apr 16 19:27:07.145481 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:07.145453 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-7jlnj"] Apr 16 19:27:07.149316 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:27:07.149288 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5aa088f_6b86_4605_85aa_e4b77454d308.slice/crio-fcf687a4e30c175317b1bc03cdb12e551a5d16be6bfae51b58f5063d76136504 WatchSource:0}: Error finding container fcf687a4e30c175317b1bc03cdb12e551a5d16be6bfae51b58f5063d76136504: Status 404 returned error can't find the container with id fcf687a4e30c175317b1bc03cdb12e551a5d16be6bfae51b58f5063d76136504 Apr 16 19:27:07.635343 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:07.635314 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-7jlnj" event={"ID":"f5aa088f-6b86-4605-85aa-e4b77454d308","Type":"ContainerStarted","Data":"fcf687a4e30c175317b1bc03cdb12e551a5d16be6bfae51b58f5063d76136504"} Apr 16 19:27:10.645828 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:10.645798 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-7jlnj" event={"ID":"f5aa088f-6b86-4605-85aa-e4b77454d308","Type":"ContainerStarted","Data":"68cdb869cdc2316bbf98697377d4ff7b9a5d71772b12c3f665eefe9ac1793ea6"} Apr 16 19:27:10.661671 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:10.661630 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-7jlnj" podStartSLOduration=1.627756743 podStartE2EDuration="4.661615463s" podCreationTimestamp="2026-04-16 19:27:06 +0000 UTC" firstStartedPulling="2026-04-16 19:27:07.151119596 +0000 UTC m=+534.638317419" lastFinishedPulling="2026-04-16 19:27:10.184978329 
+0000 UTC m=+537.672176139" observedRunningTime="2026-04-16 19:27:10.661136397 +0000 UTC m=+538.148334230" watchObservedRunningTime="2026-04-16 19:27:10.661615463 +0000 UTC m=+538.148813295" Apr 16 19:27:30.799242 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.799206 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp"] Apr 16 19:27:30.802500 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.802478 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" Apr 16 19:27:30.810652 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.810633 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 19:27:30.810977 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.810947 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 19:27:30.811101 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.810953 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-42x6b\"" Apr 16 19:27:30.811167 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.811129 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 19:27:30.811602 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.811584 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 19:27:30.835219 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.835174 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp"] Apr 16 19:27:30.876260 ip-10-0-130-83 
kubenswrapper[2580]: I0416 19:27:30.876226 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fhf7\" (UniqueName: \"kubernetes.io/projected/45f8bb8d-c41a-49de-8234-c33c2db30686-kube-api-access-6fhf7\") pod \"opendatahub-operator-controller-manager-66b64c949f-kkqkp\" (UID: \"45f8bb8d-c41a-49de-8234-c33c2db30686\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" Apr 16 19:27:30.876418 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.876274 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/45f8bb8d-c41a-49de-8234-c33c2db30686-apiservice-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-kkqkp\" (UID: \"45f8bb8d-c41a-49de-8234-c33c2db30686\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" Apr 16 19:27:30.876418 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.876303 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/45f8bb8d-c41a-49de-8234-c33c2db30686-webhook-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-kkqkp\" (UID: \"45f8bb8d-c41a-49de-8234-c33c2db30686\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" Apr 16 19:27:30.976659 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.976623 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fhf7\" (UniqueName: \"kubernetes.io/projected/45f8bb8d-c41a-49de-8234-c33c2db30686-kube-api-access-6fhf7\") pod \"opendatahub-operator-controller-manager-66b64c949f-kkqkp\" (UID: \"45f8bb8d-c41a-49de-8234-c33c2db30686\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" Apr 16 19:27:30.976659 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.976664 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/45f8bb8d-c41a-49de-8234-c33c2db30686-apiservice-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-kkqkp\" (UID: \"45f8bb8d-c41a-49de-8234-c33c2db30686\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" Apr 16 19:27:30.976898 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.976863 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/45f8bb8d-c41a-49de-8234-c33c2db30686-webhook-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-kkqkp\" (UID: \"45f8bb8d-c41a-49de-8234-c33c2db30686\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" Apr 16 19:27:30.979109 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.979080 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/45f8bb8d-c41a-49de-8234-c33c2db30686-apiservice-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-kkqkp\" (UID: \"45f8bb8d-c41a-49de-8234-c33c2db30686\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" Apr 16 19:27:30.979233 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.979110 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/45f8bb8d-c41a-49de-8234-c33c2db30686-webhook-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-kkqkp\" (UID: \"45f8bb8d-c41a-49de-8234-c33c2db30686\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" Apr 16 19:27:30.985643 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:30.985624 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fhf7\" (UniqueName: 
\"kubernetes.io/projected/45f8bb8d-c41a-49de-8234-c33c2db30686-kube-api-access-6fhf7\") pod \"opendatahub-operator-controller-manager-66b64c949f-kkqkp\" (UID: \"45f8bb8d-c41a-49de-8234-c33c2db30686\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" Apr 16 19:27:31.111993 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:31.111891 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" Apr 16 19:27:31.235140 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:31.235109 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp"] Apr 16 19:27:31.238077 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:27:31.238047 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45f8bb8d_c41a_49de_8234_c33c2db30686.slice/crio-ca6734eef67ac9d27ac7087209b501aa0ca47fc6a242a872d8b4bb1d2905387e WatchSource:0}: Error finding container ca6734eef67ac9d27ac7087209b501aa0ca47fc6a242a872d8b4bb1d2905387e: Status 404 returned error can't find the container with id ca6734eef67ac9d27ac7087209b501aa0ca47fc6a242a872d8b4bb1d2905387e Apr 16 19:27:31.701967 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:31.701933 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" event={"ID":"45f8bb8d-c41a-49de-8234-c33c2db30686","Type":"ContainerStarted","Data":"ca6734eef67ac9d27ac7087209b501aa0ca47fc6a242a872d8b4bb1d2905387e"} Apr 16 19:27:33.713002 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:33.712964 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" 
event={"ID":"45f8bb8d-c41a-49de-8234-c33c2db30686","Type":"ContainerStarted","Data":"d2cd260082781c3a15efb77fa3910218f613de1bea40d5b9c726beaab74ef21a"} Apr 16 19:27:33.713445 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:33.713125 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" Apr 16 19:27:33.736049 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:33.736000 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" podStartSLOduration=1.339494894 podStartE2EDuration="3.7359866s" podCreationTimestamp="2026-04-16 19:27:30 +0000 UTC" firstStartedPulling="2026-04-16 19:27:31.239855479 +0000 UTC m=+558.727053289" lastFinishedPulling="2026-04-16 19:27:33.636347186 +0000 UTC m=+561.123544995" observedRunningTime="2026-04-16 19:27:33.734843313 +0000 UTC m=+561.222041146" watchObservedRunningTime="2026-04-16 19:27:33.7359866 +0000 UTC m=+561.223184431" Apr 16 19:27:34.225166 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.225130 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2"] Apr 16 19:27:34.228429 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.228414 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:27:34.232365 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.232339 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 19:27:34.232487 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.232367 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 19:27:34.232487 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.232345 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-bmv5f\"" Apr 16 19:27:34.232487 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.232339 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 19:27:34.232920 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.232904 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 19:27:34.235968 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.235948 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:27:34.252733 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.252710 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2"] Apr 16 19:27:34.304861 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.304828 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/c6dea8fc-3f21-4202-975b-5d3b715d3b90-manager-config\") pod \"lws-controller-manager-66b4cb6588-4h2d2\" (UID: \"c6dea8fc-3f21-4202-975b-5d3b715d3b90\") " 
pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:27:34.305063 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.304870 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqrrs\" (UniqueName: \"kubernetes.io/projected/c6dea8fc-3f21-4202-975b-5d3b715d3b90-kube-api-access-zqrrs\") pod \"lws-controller-manager-66b4cb6588-4h2d2\" (UID: \"c6dea8fc-3f21-4202-975b-5d3b715d3b90\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:27:34.305063 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.304985 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6dea8fc-3f21-4202-975b-5d3b715d3b90-cert\") pod \"lws-controller-manager-66b4cb6588-4h2d2\" (UID: \"c6dea8fc-3f21-4202-975b-5d3b715d3b90\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:27:34.305063 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.305040 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6dea8fc-3f21-4202-975b-5d3b715d3b90-metrics-cert\") pod \"lws-controller-manager-66b4cb6588-4h2d2\" (UID: \"c6dea8fc-3f21-4202-975b-5d3b715d3b90\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:27:34.406423 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.406388 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/c6dea8fc-3f21-4202-975b-5d3b715d3b90-manager-config\") pod \"lws-controller-manager-66b4cb6588-4h2d2\" (UID: \"c6dea8fc-3f21-4202-975b-5d3b715d3b90\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:27:34.406617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.406432 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zqrrs\" (UniqueName: \"kubernetes.io/projected/c6dea8fc-3f21-4202-975b-5d3b715d3b90-kube-api-access-zqrrs\") pod \"lws-controller-manager-66b4cb6588-4h2d2\" (UID: \"c6dea8fc-3f21-4202-975b-5d3b715d3b90\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:27:34.406617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.406484 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6dea8fc-3f21-4202-975b-5d3b715d3b90-cert\") pod \"lws-controller-manager-66b4cb6588-4h2d2\" (UID: \"c6dea8fc-3f21-4202-975b-5d3b715d3b90\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:27:34.406617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.406532 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6dea8fc-3f21-4202-975b-5d3b715d3b90-metrics-cert\") pod \"lws-controller-manager-66b4cb6588-4h2d2\" (UID: \"c6dea8fc-3f21-4202-975b-5d3b715d3b90\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:27:34.407222 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.407173 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/c6dea8fc-3f21-4202-975b-5d3b715d3b90-manager-config\") pod \"lws-controller-manager-66b4cb6588-4h2d2\" (UID: \"c6dea8fc-3f21-4202-975b-5d3b715d3b90\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:27:34.409069 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.409044 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6dea8fc-3f21-4202-975b-5d3b715d3b90-metrics-cert\") pod \"lws-controller-manager-66b4cb6588-4h2d2\" (UID: \"c6dea8fc-3f21-4202-975b-5d3b715d3b90\") " 
pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:27:34.409183 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.409103 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6dea8fc-3f21-4202-975b-5d3b715d3b90-cert\") pod \"lws-controller-manager-66b4cb6588-4h2d2\" (UID: \"c6dea8fc-3f21-4202-975b-5d3b715d3b90\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:27:34.415318 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.415296 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqrrs\" (UniqueName: \"kubernetes.io/projected/c6dea8fc-3f21-4202-975b-5d3b715d3b90-kube-api-access-zqrrs\") pod \"lws-controller-manager-66b4cb6588-4h2d2\" (UID: \"c6dea8fc-3f21-4202-975b-5d3b715d3b90\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:27:34.539279 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.539183 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:27:34.667555 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.667509 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2"] Apr 16 19:27:34.668914 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:27:34.668885 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6dea8fc_3f21_4202_975b_5d3b715d3b90.slice/crio-510013aacdb8571172911bcde4590fff480812aea6b9ec8d678d321be8908602 WatchSource:0}: Error finding container 510013aacdb8571172911bcde4590fff480812aea6b9ec8d678d321be8908602: Status 404 returned error can't find the container with id 510013aacdb8571172911bcde4590fff480812aea6b9ec8d678d321be8908602 Apr 16 19:27:34.717857 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:34.717815 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" event={"ID":"c6dea8fc-3f21-4202-975b-5d3b715d3b90","Type":"ContainerStarted","Data":"510013aacdb8571172911bcde4590fff480812aea6b9ec8d678d321be8908602"} Apr 16 19:27:37.728650 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:37.728618 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" event={"ID":"c6dea8fc-3f21-4202-975b-5d3b715d3b90","Type":"ContainerStarted","Data":"5e9f69708916ceda5ae5236648bf9ca9e174e60e91a74e991f2568f48b451227"} Apr 16 19:27:37.729132 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:37.728668 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:27:37.747058 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:37.747013 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" podStartSLOduration=1.362008404 podStartE2EDuration="3.746998085s" podCreationTimestamp="2026-04-16 19:27:34 +0000 UTC" firstStartedPulling="2026-04-16 19:27:34.671111989 +0000 UTC m=+562.158309799" lastFinishedPulling="2026-04-16 19:27:37.05610167 +0000 UTC m=+564.543299480" observedRunningTime="2026-04-16 19:27:37.746303295 +0000 UTC m=+565.233501126" watchObservedRunningTime="2026-04-16 19:27:37.746998085 +0000 UTC m=+565.234195918" Apr 16 19:27:44.719778 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:44.719746 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-kkqkp" Apr 16 19:27:48.733439 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:27:48.733406 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-4h2d2" Apr 16 19:28:12.999561 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:12.999533 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:28:13.000374 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:13.000355 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:28:54.348040 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.348005 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn"] Apr 16 19:28:54.351484 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.351465 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.354040 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.354015 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 19:28:54.354167 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.354037 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-kn2qd\"" Apr 16 19:28:54.354167 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.354017 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 19:28:54.354167 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.354015 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 19:28:54.362527 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.362504 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn"] Apr 16 19:28:54.416248 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.416173 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6aadfa45-521b-495f-b191-8169463e3cd3-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.416454 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.416257 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.416454 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.416280 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.416454 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.416300 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.416454 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.416321 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjh6k\" (UniqueName: \"kubernetes.io/projected/6aadfa45-521b-495f-b191-8169463e3cd3-kube-api-access-sjh6k\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.416454 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.416411 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6aadfa45-521b-495f-b191-8169463e3cd3-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.416454 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.416446 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.416759 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.416482 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6aadfa45-521b-495f-b191-8169463e3cd3-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.416759 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.416506 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.516867 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.516827 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"istio-token\" (UniqueName: \"kubernetes.io/projected/6aadfa45-521b-495f-b191-8169463e3cd3-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.516867 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.516872 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.517112 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.516912 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6aadfa45-521b-495f-b191-8169463e3cd3-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.517112 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.516954 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.517112 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.516981 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.517309 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.517276 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.517474 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.517452 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.517474 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.517465 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjh6k\" (UniqueName: \"kubernetes.io/projected/6aadfa45-521b-495f-b191-8169463e3cd3-kube-api-access-sjh6k\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.517607 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.517363 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.517607 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.517400 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.517607 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.517554 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6aadfa45-521b-495f-b191-8169463e3cd3-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.517607 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.517588 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.517607 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.517591 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.518183 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.518162 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6aadfa45-521b-495f-b191-8169463e3cd3-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.519643 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.519614 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6aadfa45-521b-495f-b191-8169463e3cd3-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.519729 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.519669 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.530785 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.530761 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6aadfa45-521b-495f-b191-8169463e3cd3-istio-token\") pod 
\"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.530997 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.530975 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjh6k\" (UniqueName: \"kubernetes.io/projected/6aadfa45-521b-495f-b191-8169463e3cd3-kube-api-access-sjh6k\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.598717 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.598608 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq"] Apr 16 19:28:54.602821 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.602798 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.618583 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.617143 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq"] Apr 16 19:28:54.619951 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.619920 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/daf7c7ae-16fc-40bf-a381-013352b8d1b0-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.620532 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.620438 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/daf7c7ae-16fc-40bf-a381-013352b8d1b0-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.620532 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.620502 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/daf7c7ae-16fc-40bf-a381-013352b8d1b0-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.620718 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.620548 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/daf7c7ae-16fc-40bf-a381-013352b8d1b0-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.620718 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.620581 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/daf7c7ae-16fc-40bf-a381-013352b8d1b0-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.620718 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.620616 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/daf7c7ae-16fc-40bf-a381-013352b8d1b0-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.620718 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.620639 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/daf7c7ae-16fc-40bf-a381-013352b8d1b0-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.620718 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.620670 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/daf7c7ae-16fc-40bf-a381-013352b8d1b0-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.620718 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.620696 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhbdt\" (UniqueName: \"kubernetes.io/projected/daf7c7ae-16fc-40bf-a381-013352b8d1b0-kube-api-access-rhbdt\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.662891 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.662850 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:54.721992 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.721791 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/daf7c7ae-16fc-40bf-a381-013352b8d1b0-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.722154 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.722057 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/daf7c7ae-16fc-40bf-a381-013352b8d1b0-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.722154 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.722102 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/daf7c7ae-16fc-40bf-a381-013352b8d1b0-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.722154 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.722144 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/daf7c7ae-16fc-40bf-a381-013352b8d1b0-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.722363 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.722234 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/daf7c7ae-16fc-40bf-a381-013352b8d1b0-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.722680 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.722180 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/daf7c7ae-16fc-40bf-a381-013352b8d1b0-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.722782 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.722707 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/daf7c7ae-16fc-40bf-a381-013352b8d1b0-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.722782 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.722742 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/daf7c7ae-16fc-40bf-a381-013352b8d1b0-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.722782 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.722778 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/daf7c7ae-16fc-40bf-a381-013352b8d1b0-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.722924 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.722804 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhbdt\" (UniqueName: \"kubernetes.io/projected/daf7c7ae-16fc-40bf-a381-013352b8d1b0-kube-api-access-rhbdt\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.722924 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.722848 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/daf7c7ae-16fc-40bf-a381-013352b8d1b0-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.723678 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.723106 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/daf7c7ae-16fc-40bf-a381-013352b8d1b0-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.723678 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.723491 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/daf7c7ae-16fc-40bf-a381-013352b8d1b0-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.723678 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.723643 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/daf7c7ae-16fc-40bf-a381-013352b8d1b0-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.726286 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.726249 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/daf7c7ae-16fc-40bf-a381-013352b8d1b0-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.726976 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.726934 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/daf7c7ae-16fc-40bf-a381-013352b8d1b0-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 
19:28:54.730439 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.730406 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/daf7c7ae-16fc-40bf-a381-013352b8d1b0-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.730615 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.730597 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhbdt\" (UniqueName: \"kubernetes.io/projected/daf7c7ae-16fc-40bf-a381-013352b8d1b0-kube-api-access-rhbdt\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq\" (UID: \"daf7c7ae-16fc-40bf-a381-013352b8d1b0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.794779 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.794740 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn"] Apr 16 19:28:54.797425 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:28:54.797393 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aadfa45_521b_495f_b191_8169463e3cd3.slice/crio-57467c34979641dec59525aaac4ca07fbe978db47dc9fe8dd0c466b9c1794c8a WatchSource:0}: Error finding container 57467c34979641dec59525aaac4ca07fbe978db47dc9fe8dd0c466b9c1794c8a: Status 404 returned error can't find the container with id 57467c34979641dec59525aaac4ca07fbe978db47dc9fe8dd0c466b9c1794c8a Apr 16 19:28:54.922653 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.922551 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:54.937947 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:54.937916 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" event={"ID":"6aadfa45-521b-495f-b191-8169463e3cd3","Type":"ContainerStarted","Data":"57467c34979641dec59525aaac4ca07fbe978db47dc9fe8dd0c466b9c1794c8a"} Apr 16 19:28:55.050502 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:55.050468 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq"] Apr 16 19:28:55.052464 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:28:55.052432 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaf7c7ae_16fc_40bf_a381_013352b8d1b0.slice/crio-c845559b2e2fefd4509db1d2b391337315626838b247472cd45eb39bbf633d08 WatchSource:0}: Error finding container c845559b2e2fefd4509db1d2b391337315626838b247472cd45eb39bbf633d08: Status 404 returned error can't find the container with id c845559b2e2fefd4509db1d2b391337315626838b247472cd45eb39bbf633d08 Apr 16 19:28:55.943161 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:55.943116 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" event={"ID":"daf7c7ae-16fc-40bf-a381-013352b8d1b0","Type":"ContainerStarted","Data":"c845559b2e2fefd4509db1d2b391337315626838b247472cd45eb39bbf633d08"} Apr 16 19:28:57.298123 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:57.298064 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 19:28:57.298524 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:57.298168 2580 
kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 19:28:57.298524 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:57.298231 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 19:28:57.303657 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:57.303632 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 19:28:57.303731 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:57.303687 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 19:28:57.303768 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:57.303735 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 19:28:57.952995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:57.952954 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" event={"ID":"6aadfa45-521b-495f-b191-8169463e3cd3","Type":"ContainerStarted","Data":"64a609bf309c5f95801c0c58dba5b5f068585e6ae05a5c32475c1454ea220b26"} Apr 16 19:28:57.954372 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:57.954344 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" event={"ID":"daf7c7ae-16fc-40bf-a381-013352b8d1b0","Type":"ContainerStarted","Data":"912b25ecf733d652bba347ec9545f7aaa940f36002dfe5f239b09711c4ce6171"} Apr 16 
19:28:57.972828 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:57.972781 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" podStartSLOduration=1.474137286 podStartE2EDuration="3.972765445s" podCreationTimestamp="2026-04-16 19:28:54 +0000 UTC" firstStartedPulling="2026-04-16 19:28:54.799171535 +0000 UTC m=+642.286369344" lastFinishedPulling="2026-04-16 19:28:57.297799694 +0000 UTC m=+644.784997503" observedRunningTime="2026-04-16 19:28:57.971359726 +0000 UTC m=+645.458557559" watchObservedRunningTime="2026-04-16 19:28:57.972765445 +0000 UTC m=+645.459963269" Apr 16 19:28:57.991306 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:57.991243 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" podStartSLOduration=1.7422254449999999 podStartE2EDuration="3.991223723s" podCreationTimestamp="2026-04-16 19:28:54 +0000 UTC" firstStartedPulling="2026-04-16 19:28:55.054428606 +0000 UTC m=+642.541626417" lastFinishedPulling="2026-04-16 19:28:57.30342688 +0000 UTC m=+644.790624695" observedRunningTime="2026-04-16 19:28:57.990832918 +0000 UTC m=+645.478030771" watchObservedRunningTime="2026-04-16 19:28:57.991223723 +0000 UTC m=+645.478421555" Apr 16 19:28:58.663235 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:58.663174 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:28:58.664591 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:58.664566 2580 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.19:15021/healthz/ready\": dial tcp 10.134.0.19:15021: connect: connection refused" 
start-of-body= Apr 16 19:28:58.664710 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:58.664615 2580 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" podUID="6aadfa45-521b-495f-b191-8169463e3cd3" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.19:15021/healthz/ready\": dial tcp 10.134.0.19:15021: connect: connection refused" Apr 16 19:28:58.923422 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:58.923334 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:58.928069 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:58.928044 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:58.957868 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:58.957838 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:58.958734 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:58.958716 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq" Apr 16 19:28:59.008993 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:59.008961 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn"] Apr 16 19:28:59.663377 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:59.663339 2580 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.19:15021/healthz/ready\": dial tcp 
10.134.0.19:15021: connect: connection refused" start-of-body= Apr 16 19:28:59.663797 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:28:59.663402 2580 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" podUID="6aadfa45-521b-495f-b191-8169463e3cd3" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.19:15021/healthz/ready\": dial tcp 10.134.0.19:15021: connect: connection refused" Apr 16 19:29:00.663627 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:00.663588 2580 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.19:15021/healthz/ready\": dial tcp 10.134.0.19:15021: connect: connection refused" start-of-body= Apr 16 19:29:00.664027 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:00.663671 2580 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" podUID="6aadfa45-521b-495f-b191-8169463e3cd3" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.19:15021/healthz/ready\": dial tcp 10.134.0.19:15021: connect: connection refused" Apr 16 19:29:00.964262 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:00.964203 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" podUID="6aadfa45-521b-495f-b191-8169463e3cd3" containerName="istio-proxy" containerID="cri-o://64a609bf309c5f95801c0c58dba5b5f068585e6ae05a5c32475c1454ea220b26" gracePeriod=30 Apr 16 19:29:06.203382 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.203358 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:29:06.333238 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333123 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-istio-envoy\") pod \"6aadfa45-521b-495f-b191-8169463e3cd3\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " Apr 16 19:29:06.333238 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333160 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6aadfa45-521b-495f-b191-8169463e3cd3-istio-podinfo\") pod \"6aadfa45-521b-495f-b191-8169463e3cd3\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " Apr 16 19:29:06.333238 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333183 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjh6k\" (UniqueName: \"kubernetes.io/projected/6aadfa45-521b-495f-b191-8169463e3cd3-kube-api-access-sjh6k\") pod \"6aadfa45-521b-495f-b191-8169463e3cd3\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " Apr 16 19:29:06.333238 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333226 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-workload-certs\") pod \"6aadfa45-521b-495f-b191-8169463e3cd3\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " Apr 16 19:29:06.333570 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333272 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-credential-socket\") pod \"6aadfa45-521b-495f-b191-8169463e3cd3\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " Apr 16 
19:29:06.333570 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333299 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6aadfa45-521b-495f-b191-8169463e3cd3-istiod-ca-cert\") pod \"6aadfa45-521b-495f-b191-8169463e3cd3\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " Apr 16 19:29:06.333570 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333362 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-workload-socket\") pod \"6aadfa45-521b-495f-b191-8169463e3cd3\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " Apr 16 19:29:06.333570 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333389 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-istio-data\") pod \"6aadfa45-521b-495f-b191-8169463e3cd3\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " Apr 16 19:29:06.333570 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333416 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6aadfa45-521b-495f-b191-8169463e3cd3-istio-token\") pod \"6aadfa45-521b-495f-b191-8169463e3cd3\" (UID: \"6aadfa45-521b-495f-b191-8169463e3cd3\") " Apr 16 19:29:06.333570 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333537 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-workload-certs" (OuterVolumeSpecName: "workload-certs") pod "6aadfa45-521b-495f-b191-8169463e3cd3" (UID: "6aadfa45-521b-495f-b191-8169463e3cd3"). InnerVolumeSpecName "workload-certs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:29:06.333810 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333600 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-credential-socket" (OuterVolumeSpecName: "credential-socket") pod "6aadfa45-521b-495f-b191-8169463e3cd3" (UID: "6aadfa45-521b-495f-b191-8169463e3cd3"). InnerVolumeSpecName "credential-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:29:06.333810 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333678 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-workload-socket" (OuterVolumeSpecName: "workload-socket") pod "6aadfa45-521b-495f-b191-8169463e3cd3" (UID: "6aadfa45-521b-495f-b191-8169463e3cd3"). InnerVolumeSpecName "workload-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:29:06.333810 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333709 2580 reconciler_common.go:299] "Volume detached for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-workload-certs\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:29:06.333810 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333728 2580 reconciler_common.go:299] "Volume detached for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-credential-socket\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:29:06.333810 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333727 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-istio-data" (OuterVolumeSpecName: "istio-data") pod "6aadfa45-521b-495f-b191-8169463e3cd3" (UID: "6aadfa45-521b-495f-b191-8169463e3cd3"). 
InnerVolumeSpecName "istio-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:29:06.333810 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.333747 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aadfa45-521b-495f-b191-8169463e3cd3-istiod-ca-cert" (OuterVolumeSpecName: "istiod-ca-cert") pod "6aadfa45-521b-495f-b191-8169463e3cd3" (UID: "6aadfa45-521b-495f-b191-8169463e3cd3"). InnerVolumeSpecName "istiod-ca-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:29:06.335481 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.335453 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-istio-envoy" (OuterVolumeSpecName: "istio-envoy") pod "6aadfa45-521b-495f-b191-8169463e3cd3" (UID: "6aadfa45-521b-495f-b191-8169463e3cd3"). InnerVolumeSpecName "istio-envoy". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:29:06.335619 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.335599 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aadfa45-521b-495f-b191-8169463e3cd3-kube-api-access-sjh6k" (OuterVolumeSpecName: "kube-api-access-sjh6k") pod "6aadfa45-521b-495f-b191-8169463e3cd3" (UID: "6aadfa45-521b-495f-b191-8169463e3cd3"). InnerVolumeSpecName "kube-api-access-sjh6k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:29:06.335677 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.335616 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aadfa45-521b-495f-b191-8169463e3cd3-istio-token" (OuterVolumeSpecName: "istio-token") pod "6aadfa45-521b-495f-b191-8169463e3cd3" (UID: "6aadfa45-521b-495f-b191-8169463e3cd3"). InnerVolumeSpecName "istio-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:29:06.335677 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.335660 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6aadfa45-521b-495f-b191-8169463e3cd3-istio-podinfo" (OuterVolumeSpecName: "istio-podinfo") pod "6aadfa45-521b-495f-b191-8169463e3cd3" (UID: "6aadfa45-521b-495f-b191-8169463e3cd3"). InnerVolumeSpecName "istio-podinfo". PluginName "kubernetes.io/downward-api", VolumeGIDValue "" Apr 16 19:29:06.434515 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.434476 2580 reconciler_common.go:299] "Volume detached for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-workload-socket\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:29:06.434515 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.434508 2580 reconciler_common.go:299] "Volume detached for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-istio-data\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:29:06.434515 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.434517 2580 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6aadfa45-521b-495f-b191-8169463e3cd3-istio-token\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:29:06.434515 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.434526 2580 reconciler_common.go:299] "Volume detached for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6aadfa45-521b-495f-b191-8169463e3cd3-istio-envoy\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:29:06.434785 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.434534 2580 reconciler_common.go:299] "Volume detached for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6aadfa45-521b-495f-b191-8169463e3cd3-istio-podinfo\") on 
node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:29:06.434785 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.434546 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sjh6k\" (UniqueName: \"kubernetes.io/projected/6aadfa45-521b-495f-b191-8169463e3cd3-kube-api-access-sjh6k\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:29:06.434785 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.434560 2580 reconciler_common.go:299] "Volume detached for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6aadfa45-521b-495f-b191-8169463e3cd3-istiod-ca-cert\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:29:06.983363 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.983324 2580 generic.go:358] "Generic (PLEG): container finished" podID="6aadfa45-521b-495f-b191-8169463e3cd3" containerID="64a609bf309c5f95801c0c58dba5b5f068585e6ae05a5c32475c1454ea220b26" exitCode=0 Apr 16 19:29:06.983524 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.983372 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" event={"ID":"6aadfa45-521b-495f-b191-8169463e3cd3","Type":"ContainerDied","Data":"64a609bf309c5f95801c0c58dba5b5f068585e6ae05a5c32475c1454ea220b26"} Apr 16 19:29:06.983524 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.983394 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" Apr 16 19:29:06.983524 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.983406 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn" event={"ID":"6aadfa45-521b-495f-b191-8169463e3cd3","Type":"ContainerDied","Data":"57467c34979641dec59525aaac4ca07fbe978db47dc9fe8dd0c466b9c1794c8a"} Apr 16 19:29:06.983524 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.983425 2580 scope.go:117] "RemoveContainer" containerID="64a609bf309c5f95801c0c58dba5b5f068585e6ae05a5c32475c1454ea220b26" Apr 16 19:29:06.991927 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.991909 2580 scope.go:117] "RemoveContainer" containerID="64a609bf309c5f95801c0c58dba5b5f068585e6ae05a5c32475c1454ea220b26" Apr 16 19:29:06.992164 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:29:06.992144 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64a609bf309c5f95801c0c58dba5b5f068585e6ae05a5c32475c1454ea220b26\": container with ID starting with 64a609bf309c5f95801c0c58dba5b5f068585e6ae05a5c32475c1454ea220b26 not found: ID does not exist" containerID="64a609bf309c5f95801c0c58dba5b5f068585e6ae05a5c32475c1454ea220b26" Apr 16 19:29:06.992304 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:06.992174 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64a609bf309c5f95801c0c58dba5b5f068585e6ae05a5c32475c1454ea220b26"} err="failed to get container status \"64a609bf309c5f95801c0c58dba5b5f068585e6ae05a5c32475c1454ea220b26\": rpc error: code = NotFound desc = could not find container \"64a609bf309c5f95801c0c58dba5b5f068585e6ae05a5c32475c1454ea220b26\": container with ID starting with 64a609bf309c5f95801c0c58dba5b5f068585e6ae05a5c32475c1454ea220b26 not found: ID does not exist" Apr 16 19:29:07.006520 
ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:07.006496 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn"] Apr 16 19:29:07.007640 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:07.007620 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5lzrjn"] Apr 16 19:29:07.086005 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:07.085970 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aadfa45-521b-495f-b191-8169463e3cd3" path="/var/lib/kubelet/pods/6aadfa45-521b-495f-b191-8169463e3cd3/volumes" Apr 16 19:29:20.567350 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:20.567317 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n"] Apr 16 19:29:20.567844 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:20.567590 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6aadfa45-521b-495f-b191-8169463e3cd3" containerName="istio-proxy" Apr 16 19:29:20.567844 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:20.567601 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aadfa45-521b-495f-b191-8169463e3cd3" containerName="istio-proxy" Apr 16 19:29:20.567844 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:20.567674 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="6aadfa45-521b-495f-b191-8169463e3cd3" containerName="istio-proxy" Apr 16 19:29:20.571984 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:20.571965 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" Apr 16 19:29:20.574285 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:20.574263 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 19:29:20.574871 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:20.574856 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 19:29:20.574935 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:20.574856 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-bqghd\"" Apr 16 19:29:20.578683 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:20.578659 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n"] Apr 16 19:29:20.646970 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:20.646938 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv95h\" (UniqueName: \"kubernetes.io/projected/5ee13a32-964c-48de-aebd-4b54107ede14-kube-api-access-bv95h\") pod \"limitador-operator-controller-manager-85c4996f8c-92z9n\" (UID: \"5ee13a32-964c-48de-aebd-4b54107ede14\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" Apr 16 19:29:20.748377 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:20.748335 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bv95h\" (UniqueName: \"kubernetes.io/projected/5ee13a32-964c-48de-aebd-4b54107ede14-kube-api-access-bv95h\") pod \"limitador-operator-controller-manager-85c4996f8c-92z9n\" (UID: \"5ee13a32-964c-48de-aebd-4b54107ede14\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" Apr 16 19:29:20.768682 ip-10-0-130-83 
kubenswrapper[2580]: I0416 19:29:20.768652 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv95h\" (UniqueName: \"kubernetes.io/projected/5ee13a32-964c-48de-aebd-4b54107ede14-kube-api-access-bv95h\") pod \"limitador-operator-controller-manager-85c4996f8c-92z9n\" (UID: \"5ee13a32-964c-48de-aebd-4b54107ede14\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" Apr 16 19:29:20.882592 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:20.882509 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" Apr 16 19:29:21.005137 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:21.005114 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n"] Apr 16 19:29:21.007678 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:29:21.007648 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ee13a32_964c_48de_aebd_4b54107ede14.slice/crio-bdcaaedc2ba5729520f6c5445a17e336b4cb7ccc17d5c4bb222c9389f0f673e0 WatchSource:0}: Error finding container bdcaaedc2ba5729520f6c5445a17e336b4cb7ccc17d5c4bb222c9389f0f673e0: Status 404 returned error can't find the container with id bdcaaedc2ba5729520f6c5445a17e336b4cb7ccc17d5c4bb222c9389f0f673e0 Apr 16 19:29:21.027566 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:21.027537 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" event={"ID":"5ee13a32-964c-48de-aebd-4b54107ede14","Type":"ContainerStarted","Data":"bdcaaedc2ba5729520f6c5445a17e336b4cb7ccc17d5c4bb222c9389f0f673e0"} Apr 16 19:29:23.034218 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:23.034174 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" event={"ID":"5ee13a32-964c-48de-aebd-4b54107ede14","Type":"ContainerStarted","Data":"39c014c796cf62446d9d024b807070700ed3745fa5e329939984ba98c6abda36"} Apr 16 19:29:23.034577 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:23.034360 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" Apr 16 19:29:23.054309 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:23.054258 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" podStartSLOduration=1.102232074 podStartE2EDuration="3.054243912s" podCreationTimestamp="2026-04-16 19:29:20 +0000 UTC" firstStartedPulling="2026-04-16 19:29:21.009500864 +0000 UTC m=+668.496698673" lastFinishedPulling="2026-04-16 19:29:22.961512694 +0000 UTC m=+670.448710511" observedRunningTime="2026-04-16 19:29:23.052832139 +0000 UTC m=+670.540029970" watchObservedRunningTime="2026-04-16 19:29:23.054243912 +0000 UTC m=+670.541441809" Apr 16 19:29:27.884377 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:27.884341 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-mqxks"] Apr 16 19:29:27.887705 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:27.887683 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mqxks" Apr 16 19:29:27.890732 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:27.890713 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 19:29:27.891378 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:27.891358 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-zgslr\"" Apr 16 19:29:27.898057 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:27.898031 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-mqxks"] Apr 16 19:29:28.011111 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:28.011073 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp7qr\" (UniqueName: \"kubernetes.io/projected/ed7f38da-c8d3-41f3-8ddf-9aa5d20126d5-kube-api-access-wp7qr\") pod \"dns-operator-controller-manager-648d5c98bc-mqxks\" (UID: \"ed7f38da-c8d3-41f3-8ddf-9aa5d20126d5\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mqxks" Apr 16 19:29:28.112323 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:28.112280 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp7qr\" (UniqueName: \"kubernetes.io/projected/ed7f38da-c8d3-41f3-8ddf-9aa5d20126d5-kube-api-access-wp7qr\") pod \"dns-operator-controller-manager-648d5c98bc-mqxks\" (UID: \"ed7f38da-c8d3-41f3-8ddf-9aa5d20126d5\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mqxks" Apr 16 19:29:28.123518 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:28.123488 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp7qr\" (UniqueName: \"kubernetes.io/projected/ed7f38da-c8d3-41f3-8ddf-9aa5d20126d5-kube-api-access-wp7qr\") pod 
\"dns-operator-controller-manager-648d5c98bc-mqxks\" (UID: \"ed7f38da-c8d3-41f3-8ddf-9aa5d20126d5\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mqxks" Apr 16 19:29:28.198912 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:28.198881 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mqxks" Apr 16 19:29:28.322366 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:28.321742 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-mqxks"] Apr 16 19:29:29.053253 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:29.053220 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mqxks" event={"ID":"ed7f38da-c8d3-41f3-8ddf-9aa5d20126d5","Type":"ContainerStarted","Data":"4f4685302ba756ad38b717341298038d64ed6eeb2adfad356c4333082bd0deb3"} Apr 16 19:29:31.062449 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:31.062368 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mqxks" event={"ID":"ed7f38da-c8d3-41f3-8ddf-9aa5d20126d5","Type":"ContainerStarted","Data":"330127d388af094c790e822c5562cf36d1cf28d79513b4e409b4c74c31a7b198"} Apr 16 19:29:31.062881 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:31.062590 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mqxks" Apr 16 19:29:31.080846 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:31.080781 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mqxks" podStartSLOduration=1.667807327 podStartE2EDuration="4.080764403s" podCreationTimestamp="2026-04-16 19:29:27 +0000 UTC" firstStartedPulling="2026-04-16 19:29:28.330404571 +0000 UTC m=+675.817602397" 
lastFinishedPulling="2026-04-16 19:29:30.74336166 +0000 UTC m=+678.230559473" observedRunningTime="2026-04-16 19:29:31.080433143 +0000 UTC m=+678.567630974" watchObservedRunningTime="2026-04-16 19:29:31.080764403 +0000 UTC m=+678.567962236" Apr 16 19:29:32.697211 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:32.697167 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn"] Apr 16 19:29:32.700492 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:32.700469 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" Apr 16 19:29:32.703136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:32.703119 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-fm64z\"" Apr 16 19:29:32.712745 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:32.712722 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn"] Apr 16 19:29:32.746611 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:32.746582 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lglt2\" (UniqueName: \"kubernetes.io/projected/2784c663-3d1e-4a68-8c2a-de8440765b77-kube-api-access-lglt2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2q5fn\" (UID: \"2784c663-3d1e-4a68-8c2a-de8440765b77\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" Apr 16 19:29:32.746760 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:32.746623 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2784c663-3d1e-4a68-8c2a-de8440765b77-extensions-socket-volume\") pod 
\"kuadrant-operator-controller-manager-5f895dd7d5-2q5fn\" (UID: \"2784c663-3d1e-4a68-8c2a-de8440765b77\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" Apr 16 19:29:32.847055 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:32.847022 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lglt2\" (UniqueName: \"kubernetes.io/projected/2784c663-3d1e-4a68-8c2a-de8440765b77-kube-api-access-lglt2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2q5fn\" (UID: \"2784c663-3d1e-4a68-8c2a-de8440765b77\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" Apr 16 19:29:32.847164 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:32.847080 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2784c663-3d1e-4a68-8c2a-de8440765b77-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2q5fn\" (UID: \"2784c663-3d1e-4a68-8c2a-de8440765b77\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" Apr 16 19:29:32.847469 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:32.847450 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2784c663-3d1e-4a68-8c2a-de8440765b77-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2q5fn\" (UID: \"2784c663-3d1e-4a68-8c2a-de8440765b77\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" Apr 16 19:29:32.856002 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:32.855984 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lglt2\" (UniqueName: \"kubernetes.io/projected/2784c663-3d1e-4a68-8c2a-de8440765b77-kube-api-access-lglt2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2q5fn\" (UID: 
\"2784c663-3d1e-4a68-8c2a-de8440765b77\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" Apr 16 19:29:33.010547 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:33.010446 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" Apr 16 19:29:33.141201 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:33.141024 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn"] Apr 16 19:29:33.144541 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:29:33.144512 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2784c663_3d1e_4a68_8c2a_de8440765b77.slice/crio-e3d3c4457c89a7d7486c411ff97ad9a762c5b62a298b1fbb8ea777ac6f4f1833 WatchSource:0}: Error finding container e3d3c4457c89a7d7486c411ff97ad9a762c5b62a298b1fbb8ea777ac6f4f1833: Status 404 returned error can't find the container with id e3d3c4457c89a7d7486c411ff97ad9a762c5b62a298b1fbb8ea777ac6f4f1833 Apr 16 19:29:34.040378 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:34.040346 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" Apr 16 19:29:34.074151 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:34.074115 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" event={"ID":"2784c663-3d1e-4a68-8c2a-de8440765b77","Type":"ContainerStarted","Data":"e3d3c4457c89a7d7486c411ff97ad9a762c5b62a298b1fbb8ea777ac6f4f1833"} Apr 16 19:29:40.092790 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:40.092748 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" 
event={"ID":"2784c663-3d1e-4a68-8c2a-de8440765b77","Type":"ContainerStarted","Data":"df73bc6020175f4c1d4005e2769c5cc8b1978580a0fb19612899af3c472ca9ec"} Apr 16 19:29:40.093270 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:40.092874 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" Apr 16 19:29:40.115324 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:40.115263 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" podStartSLOduration=1.89067662 podStartE2EDuration="8.115244189s" podCreationTimestamp="2026-04-16 19:29:32 +0000 UTC" firstStartedPulling="2026-04-16 19:29:33.147013305 +0000 UTC m=+680.634211115" lastFinishedPulling="2026-04-16 19:29:39.371580862 +0000 UTC m=+686.858778684" observedRunningTime="2026-04-16 19:29:40.113246616 +0000 UTC m=+687.600444445" watchObservedRunningTime="2026-04-16 19:29:40.115244189 +0000 UTC m=+687.602442022" Apr 16 19:29:42.067881 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:42.067853 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mqxks" Apr 16 19:29:51.097012 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:51.096984 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" Apr 16 19:29:52.022926 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.022895 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn"] Apr 16 19:29:52.023123 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.023091 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" 
podUID="2784c663-3d1e-4a68-8c2a-de8440765b77" containerName="manager" containerID="cri-o://df73bc6020175f4c1d4005e2769c5cc8b1978580a0fb19612899af3c472ca9ec" gracePeriod=2 Apr 16 19:29:52.037617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.037592 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn"] Apr 16 19:29:52.048747 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.048716 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n"] Apr 16 19:29:52.049069 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.049041 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" podUID="5ee13a32-964c-48de-aebd-4b54107ede14" containerName="manager" containerID="cri-o://39c014c796cf62446d9d024b807070700ed3745fa5e329939984ba98c6abda36" gracePeriod=2 Apr 16 19:29:52.059237 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.059177 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n"] Apr 16 19:29:52.062631 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.062608 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6"] Apr 16 19:29:52.062966 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.062950 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2784c663-3d1e-4a68-8c2a-de8440765b77" containerName="manager" Apr 16 19:29:52.063052 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.062968 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2784c663-3d1e-4a68-8c2a-de8440765b77" containerName="manager" Apr 16 19:29:52.063052 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.062985 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="5ee13a32-964c-48de-aebd-4b54107ede14" containerName="manager" Apr 16 19:29:52.063052 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.062993 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee13a32-964c-48de-aebd-4b54107ede14" containerName="manager" Apr 16 19:29:52.063214 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.063073 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="2784c663-3d1e-4a68-8c2a-de8440765b77" containerName="manager" Apr 16 19:29:52.063214 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.063086 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ee13a32-964c-48de-aebd-4b54107ede14" containerName="manager" Apr 16 19:29:52.065952 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.065933 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" Apr 16 19:29:52.070270 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.070249 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p4hdf"] Apr 16 19:29:52.073235 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.073219 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p4hdf" Apr 16 19:29:52.077064 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.077043 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6"] Apr 16 19:29:52.088919 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.088894 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p4hdf"] Apr 16 19:29:52.093389 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.093352 2580 status_manager.go:895] "Failed to get status for pod" podUID="5ee13a32-964c-48de-aebd-4b54107ede14" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" err="pods \"limitador-operator-controller-manager-85c4996f8c-92z9n\" is forbidden: User \"system:node:ip-10-0-130-83.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-83.ec2.internal' and this object" Apr 16 19:29:52.097557 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.097528 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w92cr\" (UniqueName: \"kubernetes.io/projected/e89f772c-1c40-4d18-a1d3-e85a7b6e39c9-kube-api-access-w92cr\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q9cg6\" (UID: \"e89f772c-1c40-4d18-a1d3-e85a7b6e39c9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" Apr 16 19:29:52.097875 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.097663 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e89f772c-1c40-4d18-a1d3-e85a7b6e39c9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q9cg6\" (UID: 
\"e89f772c-1c40-4d18-a1d3-e85a7b6e39c9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" Apr 16 19:29:52.112776 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.112749 2580 status_manager.go:895] "Failed to get status for pod" podUID="5ee13a32-964c-48de-aebd-4b54107ede14" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" err="pods \"limitador-operator-controller-manager-85c4996f8c-92z9n\" is forbidden: User \"system:node:ip-10-0-130-83.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-83.ec2.internal' and this object" Apr 16 19:29:52.199071 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.199037 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e89f772c-1c40-4d18-a1d3-e85a7b6e39c9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q9cg6\" (UID: \"e89f772c-1c40-4d18-a1d3-e85a7b6e39c9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" Apr 16 19:29:52.199256 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.199106 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w92cr\" (UniqueName: \"kubernetes.io/projected/e89f772c-1c40-4d18-a1d3-e85a7b6e39c9-kube-api-access-w92cr\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q9cg6\" (UID: \"e89f772c-1c40-4d18-a1d3-e85a7b6e39c9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" Apr 16 19:29:52.199256 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.199166 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh66p\" (UniqueName: \"kubernetes.io/projected/55d9ad56-b9ee-4b5b-b3b7-b283c1345cb4-kube-api-access-lh66p\") pod 
\"limitador-operator-controller-manager-85c4996f8c-p4hdf\" (UID: \"55d9ad56-b9ee-4b5b-b3b7-b283c1345cb4\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p4hdf" Apr 16 19:29:52.199432 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.199412 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e89f772c-1c40-4d18-a1d3-e85a7b6e39c9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q9cg6\" (UID: \"e89f772c-1c40-4d18-a1d3-e85a7b6e39c9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" Apr 16 19:29:52.207887 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.207861 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w92cr\" (UniqueName: \"kubernetes.io/projected/e89f772c-1c40-4d18-a1d3-e85a7b6e39c9-kube-api-access-w92cr\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q9cg6\" (UID: \"e89f772c-1c40-4d18-a1d3-e85a7b6e39c9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" Apr 16 19:29:52.277766 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.277706 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" Apr 16 19:29:52.279682 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.279656 2580 status_manager.go:895] "Failed to get status for pod" podUID="5ee13a32-964c-48de-aebd-4b54107ede14" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" err="pods \"limitador-operator-controller-manager-85c4996f8c-92z9n\" is forbidden: User \"system:node:ip-10-0-130-83.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-83.ec2.internal' and this object" Apr 16 19:29:52.281330 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.281307 2580 status_manager.go:895] "Failed to get status for pod" podUID="2784c663-3d1e-4a68-8c2a-de8440765b77" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-2q5fn\" is forbidden: User \"system:node:ip-10-0-130-83.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-83.ec2.internal' and this object" Apr 16 19:29:52.283648 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.283630 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" Apr 16 19:29:52.285630 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.285586 2580 status_manager.go:895] "Failed to get status for pod" podUID="5ee13a32-964c-48de-aebd-4b54107ede14" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" err="pods \"limitador-operator-controller-manager-85c4996f8c-92z9n\" is forbidden: User \"system:node:ip-10-0-130-83.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-83.ec2.internal' and this object" Apr 16 19:29:52.287360 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.287338 2580 status_manager.go:895] "Failed to get status for pod" podUID="2784c663-3d1e-4a68-8c2a-de8440765b77" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-2q5fn\" is forbidden: User \"system:node:ip-10-0-130-83.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-83.ec2.internal' and this object" Apr 16 19:29:52.299762 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.299731 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lh66p\" (UniqueName: \"kubernetes.io/projected/55d9ad56-b9ee-4b5b-b3b7-b283c1345cb4-kube-api-access-lh66p\") pod \"limitador-operator-controller-manager-85c4996f8c-p4hdf\" (UID: \"55d9ad56-b9ee-4b5b-b3b7-b283c1345cb4\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p4hdf" Apr 16 19:29:52.311978 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.311957 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh66p\" (UniqueName: \"kubernetes.io/projected/55d9ad56-b9ee-4b5b-b3b7-b283c1345cb4-kube-api-access-lh66p\") pod 
\"limitador-operator-controller-manager-85c4996f8c-p4hdf\" (UID: \"55d9ad56-b9ee-4b5b-b3b7-b283c1345cb4\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p4hdf" Apr 16 19:29:52.400267 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.400231 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2784c663-3d1e-4a68-8c2a-de8440765b77-extensions-socket-volume\") pod \"2784c663-3d1e-4a68-8c2a-de8440765b77\" (UID: \"2784c663-3d1e-4a68-8c2a-de8440765b77\") " Apr 16 19:29:52.400435 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.400288 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lglt2\" (UniqueName: \"kubernetes.io/projected/2784c663-3d1e-4a68-8c2a-de8440765b77-kube-api-access-lglt2\") pod \"2784c663-3d1e-4a68-8c2a-de8440765b77\" (UID: \"2784c663-3d1e-4a68-8c2a-de8440765b77\") " Apr 16 19:29:52.400435 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.400353 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv95h\" (UniqueName: \"kubernetes.io/projected/5ee13a32-964c-48de-aebd-4b54107ede14-kube-api-access-bv95h\") pod \"5ee13a32-964c-48de-aebd-4b54107ede14\" (UID: \"5ee13a32-964c-48de-aebd-4b54107ede14\") " Apr 16 19:29:52.400803 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.400764 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2784c663-3d1e-4a68-8c2a-de8440765b77-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "2784c663-3d1e-4a68-8c2a-de8440765b77" (UID: "2784c663-3d1e-4a68-8c2a-de8440765b77"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:29:52.402489 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.402464 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2784c663-3d1e-4a68-8c2a-de8440765b77-kube-api-access-lglt2" (OuterVolumeSpecName: "kube-api-access-lglt2") pod "2784c663-3d1e-4a68-8c2a-de8440765b77" (UID: "2784c663-3d1e-4a68-8c2a-de8440765b77"). InnerVolumeSpecName "kube-api-access-lglt2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:29:52.402548 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.402489 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee13a32-964c-48de-aebd-4b54107ede14-kube-api-access-bv95h" (OuterVolumeSpecName: "kube-api-access-bv95h") pod "5ee13a32-964c-48de-aebd-4b54107ede14" (UID: "5ee13a32-964c-48de-aebd-4b54107ede14"). InnerVolumeSpecName "kube-api-access-bv95h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:29:52.467118 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.467082 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" Apr 16 19:29:52.472915 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.472885 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p4hdf" Apr 16 19:29:52.500836 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.500803 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lglt2\" (UniqueName: \"kubernetes.io/projected/2784c663-3d1e-4a68-8c2a-de8440765b77-kube-api-access-lglt2\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:29:52.500836 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.500831 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bv95h\" (UniqueName: \"kubernetes.io/projected/5ee13a32-964c-48de-aebd-4b54107ede14-kube-api-access-bv95h\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:29:52.500836 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.500841 2580 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2784c663-3d1e-4a68-8c2a-de8440765b77-extensions-socket-volume\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:29:52.605778 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.605751 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6"] Apr 16 19:29:52.608992 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:29:52.608963 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode89f772c_1c40_4d18_a1d3_e85a7b6e39c9.slice/crio-2fbd37be47652c57aceb11b52829794cd0dc0dcdccda7774965ce0792f02bc8e WatchSource:0}: Error finding container 2fbd37be47652c57aceb11b52829794cd0dc0dcdccda7774965ce0792f02bc8e: Status 404 returned error can't find the container with id 2fbd37be47652c57aceb11b52829794cd0dc0dcdccda7774965ce0792f02bc8e Apr 16 19:29:52.624739 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:52.624710 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p4hdf"] Apr 16 19:29:52.627853 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:29:52.627828 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55d9ad56_b9ee_4b5b_b3b7_b283c1345cb4.slice/crio-90ec44a63d982e1b1e6b891f1cd05b0bb9bbafd880017c5ca1717c2a65a50bab WatchSource:0}: Error finding container 90ec44a63d982e1b1e6b891f1cd05b0bb9bbafd880017c5ca1717c2a65a50bab: Status 404 returned error can't find the container with id 90ec44a63d982e1b1e6b891f1cd05b0bb9bbafd880017c5ca1717c2a65a50bab Apr 16 19:29:53.086833 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.086798 2580 status_manager.go:895] "Failed to get status for pod" podUID="2784c663-3d1e-4a68-8c2a-de8440765b77" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-2q5fn\" is forbidden: User \"system:node:ip-10-0-130-83.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-83.ec2.internal' and this object" Apr 16 19:29:53.087651 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.087627 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2784c663-3d1e-4a68-8c2a-de8440765b77" path="/var/lib/kubelet/pods/2784c663-3d1e-4a68-8c2a-de8440765b77/volumes" Apr 16 19:29:53.087939 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.087927 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ee13a32-964c-48de-aebd-4b54107ede14" path="/var/lib/kubelet/pods/5ee13a32-964c-48de-aebd-4b54107ede14/volumes" Apr 16 19:29:53.130502 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.130467 2580 generic.go:358] "Generic (PLEG): container finished" podID="5ee13a32-964c-48de-aebd-4b54107ede14" containerID="39c014c796cf62446d9d024b807070700ed3745fa5e329939984ba98c6abda36" 
exitCode=0 Apr 16 19:29:53.130939 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.130546 2580 scope.go:117] "RemoveContainer" containerID="39c014c796cf62446d9d024b807070700ed3745fa5e329939984ba98c6abda36" Apr 16 19:29:53.130939 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.130560 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" Apr 16 19:29:53.131729 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.131696 2580 status_manager.go:895] "Failed to get status for pod" podUID="5ee13a32-964c-48de-aebd-4b54107ede14" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-92z9n" err="pods \"limitador-operator-controller-manager-85c4996f8c-92z9n\" is forbidden: User \"system:node:ip-10-0-130-83.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-83.ec2.internal' and this object" Apr 16 19:29:53.133014 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.132501 2580 generic.go:358] "Generic (PLEG): container finished" podID="2784c663-3d1e-4a68-8c2a-de8440765b77" containerID="df73bc6020175f4c1d4005e2769c5cc8b1978580a0fb19612899af3c472ca9ec" exitCode=0 Apr 16 19:29:53.133014 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.132602 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2q5fn" Apr 16 19:29:53.134746 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.134717 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p4hdf" event={"ID":"55d9ad56-b9ee-4b5b-b3b7-b283c1345cb4","Type":"ContainerStarted","Data":"af832da173fcffa4c8f5dcdc5cd428cdc18aee014b6681ec8b50ab2cd4e0497b"} Apr 16 19:29:53.134868 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.134758 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p4hdf" event={"ID":"55d9ad56-b9ee-4b5b-b3b7-b283c1345cb4","Type":"ContainerStarted","Data":"90ec44a63d982e1b1e6b891f1cd05b0bb9bbafd880017c5ca1717c2a65a50bab"} Apr 16 19:29:53.135491 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.135438 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p4hdf" Apr 16 19:29:53.137026 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.136971 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" event={"ID":"e89f772c-1c40-4d18-a1d3-e85a7b6e39c9","Type":"ContainerStarted","Data":"dfc505f60e4608e50ada8f0302e3722dedad1162606a247317e85bd71485d572"} Apr 16 19:29:53.137026 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.137003 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" event={"ID":"e89f772c-1c40-4d18-a1d3-e85a7b6e39c9","Type":"ContainerStarted","Data":"2fbd37be47652c57aceb11b52829794cd0dc0dcdccda7774965ce0792f02bc8e"} Apr 16 19:29:53.137211 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.137108 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" Apr 16 19:29:53.140116 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.140099 2580 scope.go:117] "RemoveContainer" containerID="39c014c796cf62446d9d024b807070700ed3745fa5e329939984ba98c6abda36" Apr 16 19:29:53.140391 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:29:53.140374 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39c014c796cf62446d9d024b807070700ed3745fa5e329939984ba98c6abda36\": container with ID starting with 39c014c796cf62446d9d024b807070700ed3745fa5e329939984ba98c6abda36 not found: ID does not exist" containerID="39c014c796cf62446d9d024b807070700ed3745fa5e329939984ba98c6abda36" Apr 16 19:29:53.140451 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.140399 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39c014c796cf62446d9d024b807070700ed3745fa5e329939984ba98c6abda36"} err="failed to get container status \"39c014c796cf62446d9d024b807070700ed3745fa5e329939984ba98c6abda36\": rpc error: code = NotFound desc = could not find container \"39c014c796cf62446d9d024b807070700ed3745fa5e329939984ba98c6abda36\": container with ID starting with 39c014c796cf62446d9d024b807070700ed3745fa5e329939984ba98c6abda36 not found: ID does not exist" Apr 16 19:29:53.140451 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.140415 2580 scope.go:117] "RemoveContainer" containerID="df73bc6020175f4c1d4005e2769c5cc8b1978580a0fb19612899af3c472ca9ec" Apr 16 19:29:53.148335 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.148320 2580 scope.go:117] "RemoveContainer" containerID="df73bc6020175f4c1d4005e2769c5cc8b1978580a0fb19612899af3c472ca9ec" Apr 16 19:29:53.148593 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:29:53.148572 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"df73bc6020175f4c1d4005e2769c5cc8b1978580a0fb19612899af3c472ca9ec\": container with ID starting with df73bc6020175f4c1d4005e2769c5cc8b1978580a0fb19612899af3c472ca9ec not found: ID does not exist" containerID="df73bc6020175f4c1d4005e2769c5cc8b1978580a0fb19612899af3c472ca9ec" Apr 16 19:29:53.148679 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.148598 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df73bc6020175f4c1d4005e2769c5cc8b1978580a0fb19612899af3c472ca9ec"} err="failed to get container status \"df73bc6020175f4c1d4005e2769c5cc8b1978580a0fb19612899af3c472ca9ec\": rpc error: code = NotFound desc = could not find container \"df73bc6020175f4c1d4005e2769c5cc8b1978580a0fb19612899af3c472ca9ec\": container with ID starting with df73bc6020175f4c1d4005e2769c5cc8b1978580a0fb19612899af3c472ca9ec not found: ID does not exist" Apr 16 19:29:53.159051 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.159012 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p4hdf" podStartSLOduration=1.159002418 podStartE2EDuration="1.159002418s" podCreationTimestamp="2026-04-16 19:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:29:53.157419111 +0000 UTC m=+700.644616942" watchObservedRunningTime="2026-04-16 19:29:53.159002418 +0000 UTC m=+700.646200249" Apr 16 19:29:53.178126 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:29:53.178085 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" podStartSLOduration=1.178068382 podStartE2EDuration="1.178068382s" podCreationTimestamp="2026-04-16 19:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
19:29:53.177634708 +0000 UTC m=+700.664832538" watchObservedRunningTime="2026-04-16 19:29:53.178068382 +0000 UTC m=+700.665266215" Apr 16 19:30:04.144808 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:04.144774 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" Apr 16 19:30:05.147667 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:05.147642 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-p4hdf" Apr 16 19:30:08.338774 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:08.338740 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6"] Apr 16 19:30:08.339229 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:08.338967 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" podUID="e89f772c-1c40-4d18-a1d3-e85a7b6e39c9" containerName="manager" containerID="cri-o://dfc505f60e4608e50ada8f0302e3722dedad1162606a247317e85bd71485d572" gracePeriod=10 Apr 16 19:30:08.579927 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:08.579904 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" Apr 16 19:30:08.737120 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:08.737090 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e89f772c-1c40-4d18-a1d3-e85a7b6e39c9-extensions-socket-volume\") pod \"e89f772c-1c40-4d18-a1d3-e85a7b6e39c9\" (UID: \"e89f772c-1c40-4d18-a1d3-e85a7b6e39c9\") " Apr 16 19:30:08.737306 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:08.737220 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w92cr\" (UniqueName: \"kubernetes.io/projected/e89f772c-1c40-4d18-a1d3-e85a7b6e39c9-kube-api-access-w92cr\") pod \"e89f772c-1c40-4d18-a1d3-e85a7b6e39c9\" (UID: \"e89f772c-1c40-4d18-a1d3-e85a7b6e39c9\") " Apr 16 19:30:08.737548 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:08.737518 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89f772c-1c40-4d18-a1d3-e85a7b6e39c9-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "e89f772c-1c40-4d18-a1d3-e85a7b6e39c9" (UID: "e89f772c-1c40-4d18-a1d3-e85a7b6e39c9"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:30:08.739150 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:08.739129 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e89f772c-1c40-4d18-a1d3-e85a7b6e39c9-kube-api-access-w92cr" (OuterVolumeSpecName: "kube-api-access-w92cr") pod "e89f772c-1c40-4d18-a1d3-e85a7b6e39c9" (UID: "e89f772c-1c40-4d18-a1d3-e85a7b6e39c9"). InnerVolumeSpecName "kube-api-access-w92cr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:30:08.838334 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:08.838304 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w92cr\" (UniqueName: \"kubernetes.io/projected/e89f772c-1c40-4d18-a1d3-e85a7b6e39c9-kube-api-access-w92cr\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:30:08.838334 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:08.838330 2580 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e89f772c-1c40-4d18-a1d3-e85a7b6e39c9-extensions-socket-volume\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:30:09.189409 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:09.189378 2580 generic.go:358] "Generic (PLEG): container finished" podID="e89f772c-1c40-4d18-a1d3-e85a7b6e39c9" containerID="dfc505f60e4608e50ada8f0302e3722dedad1162606a247317e85bd71485d572" exitCode=0 Apr 16 19:30:09.189583 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:09.189441 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" Apr 16 19:30:09.189583 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:09.189464 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" event={"ID":"e89f772c-1c40-4d18-a1d3-e85a7b6e39c9","Type":"ContainerDied","Data":"dfc505f60e4608e50ada8f0302e3722dedad1162606a247317e85bd71485d572"} Apr 16 19:30:09.189583 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:09.189505 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6" event={"ID":"e89f772c-1c40-4d18-a1d3-e85a7b6e39c9","Type":"ContainerDied","Data":"2fbd37be47652c57aceb11b52829794cd0dc0dcdccda7774965ce0792f02bc8e"} Apr 16 19:30:09.189583 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:09.189521 2580 scope.go:117] "RemoveContainer" containerID="dfc505f60e4608e50ada8f0302e3722dedad1162606a247317e85bd71485d572" Apr 16 19:30:09.197741 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:09.197722 2580 scope.go:117] "RemoveContainer" containerID="dfc505f60e4608e50ada8f0302e3722dedad1162606a247317e85bd71485d572" Apr 16 19:30:09.198021 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:30:09.197999 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfc505f60e4608e50ada8f0302e3722dedad1162606a247317e85bd71485d572\": container with ID starting with dfc505f60e4608e50ada8f0302e3722dedad1162606a247317e85bd71485d572 not found: ID does not exist" containerID="dfc505f60e4608e50ada8f0302e3722dedad1162606a247317e85bd71485d572" Apr 16 19:30:09.198133 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:09.198034 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfc505f60e4608e50ada8f0302e3722dedad1162606a247317e85bd71485d572"} err="failed to get container status 
\"dfc505f60e4608e50ada8f0302e3722dedad1162606a247317e85bd71485d572\": rpc error: code = NotFound desc = could not find container \"dfc505f60e4608e50ada8f0302e3722dedad1162606a247317e85bd71485d572\": container with ID starting with dfc505f60e4608e50ada8f0302e3722dedad1162606a247317e85bd71485d572 not found: ID does not exist" Apr 16 19:30:09.205378 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:09.205351 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6"] Apr 16 19:30:09.209098 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:09.209074 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q9cg6"] Apr 16 19:30:11.086521 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:11.086487 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e89f772c-1c40-4d18-a1d3-e85a7b6e39c9" path="/var/lib/kubelet/pods/e89f772c-1c40-4d18-a1d3-e85a7b6e39c9/volumes" Apr 16 19:30:24.535917 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.535859 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss"] Apr 16 19:30:24.536457 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.536319 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e89f772c-1c40-4d18-a1d3-e85a7b6e39c9" containerName="manager" Apr 16 19:30:24.536457 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.536340 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89f772c-1c40-4d18-a1d3-e85a7b6e39c9" containerName="manager" Apr 16 19:30:24.536457 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.536415 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="e89f772c-1c40-4d18-a1d3-e85a7b6e39c9" containerName="manager" Apr 16 19:30:24.541589 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.541566 2580 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.543976 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.543951 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-kqvbr\"" Apr 16 19:30:24.551648 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.551621 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss"] Apr 16 19:30:24.664136 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.664099 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c5891986-b5a8-4513-9eb2-21ab3a24ee40-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.664340 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.664143 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcwbf\" (UniqueName: \"kubernetes.io/projected/c5891986-b5a8-4513-9eb2-21ab3a24ee40-kube-api-access-lcwbf\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.664340 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.664271 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c5891986-b5a8-4513-9eb2-21ab3a24ee40-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.664340 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.664334 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c5891986-b5a8-4513-9eb2-21ab3a24ee40-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.664457 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.664354 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c5891986-b5a8-4513-9eb2-21ab3a24ee40-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.664457 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.664402 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c5891986-b5a8-4513-9eb2-21ab3a24ee40-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.664457 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.664434 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c5891986-b5a8-4513-9eb2-21ab3a24ee40-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.664547 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.664470 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c5891986-b5a8-4513-9eb2-21ab3a24ee40-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.664547 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.664486 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c5891986-b5a8-4513-9eb2-21ab3a24ee40-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.764869 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.764824 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c5891986-b5a8-4513-9eb2-21ab3a24ee40-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.765042 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.764899 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c5891986-b5a8-4513-9eb2-21ab3a24ee40-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.765042 
ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.764928 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c5891986-b5a8-4513-9eb2-21ab3a24ee40-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.765042 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.764951 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c5891986-b5a8-4513-9eb2-21ab3a24ee40-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.765042 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.764979 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c5891986-b5a8-4513-9eb2-21ab3a24ee40-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.765042 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.765021 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c5891986-b5a8-4513-9eb2-21ab3a24ee40-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.765349 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.765045 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c5891986-b5a8-4513-9eb2-21ab3a24ee40-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.765349 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.765075 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c5891986-b5a8-4513-9eb2-21ab3a24ee40-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.765349 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.765106 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcwbf\" (UniqueName: \"kubernetes.io/projected/c5891986-b5a8-4513-9eb2-21ab3a24ee40-kube-api-access-lcwbf\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.765349 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.765306 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c5891986-b5a8-4513-9eb2-21ab3a24ee40-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.765557 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.765416 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c5891986-b5a8-4513-9eb2-21ab3a24ee40-workload-socket\") pod 
\"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.765619 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.765567 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c5891986-b5a8-4513-9eb2-21ab3a24ee40-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.765680 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.765623 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c5891986-b5a8-4513-9eb2-21ab3a24ee40-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.765729 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.765678 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c5891986-b5a8-4513-9eb2-21ab3a24ee40-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.767476 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.767450 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c5891986-b5a8-4513-9eb2-21ab3a24ee40-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.767711 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.767693 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c5891986-b5a8-4513-9eb2-21ab3a24ee40-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.773159 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.773124 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c5891986-b5a8-4513-9eb2-21ab3a24ee40-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.773302 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.773211 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcwbf\" (UniqueName: \"kubernetes.io/projected/c5891986-b5a8-4513-9eb2-21ab3a24ee40-kube-api-access-lcwbf\") pod \"maas-default-gateway-openshift-default-58b6f876-k5nss\" (UID: \"c5891986-b5a8-4513-9eb2-21ab3a24ee40\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.854489 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.854365 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:24.977396 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.977345 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss"] Apr 16 19:30:24.980141 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:30:24.980109 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5891986_b5a8_4513_9eb2_21ab3a24ee40.slice/crio-15c28e75c3cebebeb0f0ae5111d030c5ccabc4eac491cb5d10b7391c5c7dcf82 WatchSource:0}: Error finding container 15c28e75c3cebebeb0f0ae5111d030c5ccabc4eac491cb5d10b7391c5c7dcf82: Status 404 returned error can't find the container with id 15c28e75c3cebebeb0f0ae5111d030c5ccabc4eac491cb5d10b7391c5c7dcf82 Apr 16 19:30:24.982271 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.982238 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 19:30:24.982393 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.982309 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 19:30:24.982393 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:24.982350 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 19:30:25.243953 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:25.243915 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" 
event={"ID":"c5891986-b5a8-4513-9eb2-21ab3a24ee40","Type":"ContainerStarted","Data":"74aa792bd5f94a9f0facb8a63c1301010dc6d83658de33cf767ecc4b0dc6ced4"} Apr 16 19:30:25.243953 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:25.243958 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" event={"ID":"c5891986-b5a8-4513-9eb2-21ab3a24ee40","Type":"ContainerStarted","Data":"15c28e75c3cebebeb0f0ae5111d030c5ccabc4eac491cb5d10b7391c5c7dcf82"} Apr 16 19:30:25.854696 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:25.854663 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:26.859640 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:26.859610 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:26.882385 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:26.882338 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" podStartSLOduration=2.882320835 podStartE2EDuration="2.882320835s" podCreationTimestamp="2026-04-16 19:30:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:30:25.270605452 +0000 UTC m=+732.757803283" watchObservedRunningTime="2026-04-16 19:30:26.882320835 +0000 UTC m=+734.369518668" Apr 16 19:30:27.251877 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:27.251846 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:27.252885 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:27.252864 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-k5nss" Apr 16 19:30:28.682924 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:28.682844 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8jpqw"] Apr 16 19:30:28.686017 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:28.685998 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw" Apr 16 19:30:28.688506 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:28.688480 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 19:30:28.688602 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:28.688506 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-c46nd\"" Apr 16 19:30:28.700594 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:28.700573 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8jpqw"] Apr 16 19:30:28.779632 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:28.779596 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8jpqw"] Apr 16 19:30:28.801559 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:28.801531 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br898\" (UniqueName: \"kubernetes.io/projected/1c876e05-8fef-479b-a538-c5142f58aca2-kube-api-access-br898\") pod \"limitador-limitador-7d549b5b-8jpqw\" (UID: \"1c876e05-8fef-479b-a538-c5142f58aca2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw" Apr 16 19:30:28.801712 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:28.801574 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: 
\"kubernetes.io/configmap/1c876e05-8fef-479b-a538-c5142f58aca2-config-file\") pod \"limitador-limitador-7d549b5b-8jpqw\" (UID: \"1c876e05-8fef-479b-a538-c5142f58aca2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw" Apr 16 19:30:28.902262 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:28.902220 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-br898\" (UniqueName: \"kubernetes.io/projected/1c876e05-8fef-479b-a538-c5142f58aca2-kube-api-access-br898\") pod \"limitador-limitador-7d549b5b-8jpqw\" (UID: \"1c876e05-8fef-479b-a538-c5142f58aca2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw" Apr 16 19:30:28.902456 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:28.902292 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1c876e05-8fef-479b-a538-c5142f58aca2-config-file\") pod \"limitador-limitador-7d549b5b-8jpqw\" (UID: \"1c876e05-8fef-479b-a538-c5142f58aca2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw" Apr 16 19:30:28.903291 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:28.903269 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1c876e05-8fef-479b-a538-c5142f58aca2-config-file\") pod \"limitador-limitador-7d549b5b-8jpqw\" (UID: \"1c876e05-8fef-479b-a538-c5142f58aca2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw" Apr 16 19:30:28.909997 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:28.909974 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-br898\" (UniqueName: \"kubernetes.io/projected/1c876e05-8fef-479b-a538-c5142f58aca2-kube-api-access-br898\") pod \"limitador-limitador-7d549b5b-8jpqw\" (UID: \"1c876e05-8fef-479b-a538-c5142f58aca2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw" Apr 16 19:30:28.995918 ip-10-0-130-83 kubenswrapper[2580]: I0416 
19:30:28.995880 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw" Apr 16 19:30:29.132631 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:29.132596 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8jpqw"] Apr 16 19:30:29.135376 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:30:29.135336 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c876e05_8fef_479b_a538_c5142f58aca2.slice/crio-fbfe66b1c99264ec6d0a5088c81505cf9f14e3901dd7ef706ed1fe118bb7c20a WatchSource:0}: Error finding container fbfe66b1c99264ec6d0a5088c81505cf9f14e3901dd7ef706ed1fe118bb7c20a: Status 404 returned error can't find the container with id fbfe66b1c99264ec6d0a5088c81505cf9f14e3901dd7ef706ed1fe118bb7c20a Apr 16 19:30:29.257530 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:29.257454 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw" event={"ID":"1c876e05-8fef-479b-a538-c5142f58aca2","Type":"ContainerStarted","Data":"fbfe66b1c99264ec6d0a5088c81505cf9f14e3901dd7ef706ed1fe118bb7c20a"} Apr 16 19:30:29.533712 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:29.533628 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-t2klc"] Apr 16 19:30:29.538256 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:29.538234 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-t2klc" Apr 16 19:30:29.540672 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:29.540651 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-ffvgr\"" Apr 16 19:30:29.545148 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:29.545124 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-t2klc"] Apr 16 19:30:29.709585 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:29.709552 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8fz4\" (UniqueName: \"kubernetes.io/projected/54f16fce-7c8a-4710-bcf1-a3a4c597d7c9-kube-api-access-m8fz4\") pod \"authorino-f99f4b5cd-t2klc\" (UID: \"54f16fce-7c8a-4710-bcf1-a3a4c597d7c9\") " pod="kuadrant-system/authorino-f99f4b5cd-t2klc" Apr 16 19:30:29.743973 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:29.743939 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-jp8hr"] Apr 16 19:30:29.747043 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:29.747019 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-jp8hr" Apr 16 19:30:29.754328 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:29.753890 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-jp8hr"] Apr 16 19:30:29.810499 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:29.810413 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8fz4\" (UniqueName: \"kubernetes.io/projected/54f16fce-7c8a-4710-bcf1-a3a4c597d7c9-kube-api-access-m8fz4\") pod \"authorino-f99f4b5cd-t2klc\" (UID: \"54f16fce-7c8a-4710-bcf1-a3a4c597d7c9\") " pod="kuadrant-system/authorino-f99f4b5cd-t2klc" Apr 16 19:30:29.819860 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:29.819836 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8fz4\" (UniqueName: \"kubernetes.io/projected/54f16fce-7c8a-4710-bcf1-a3a4c597d7c9-kube-api-access-m8fz4\") pod \"authorino-f99f4b5cd-t2klc\" (UID: \"54f16fce-7c8a-4710-bcf1-a3a4c597d7c9\") " pod="kuadrant-system/authorino-f99f4b5cd-t2klc" Apr 16 19:30:29.849661 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:29.849620 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-t2klc" Apr 16 19:30:29.911794 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:29.911730 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n24kx\" (UniqueName: \"kubernetes.io/projected/082eb313-d0d8-45e0-81df-67ca16b31c25-kube-api-access-n24kx\") pod \"authorino-7498df8756-jp8hr\" (UID: \"082eb313-d0d8-45e0-81df-67ca16b31c25\") " pod="kuadrant-system/authorino-7498df8756-jp8hr" Apr 16 19:30:30.012891 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:30.012847 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n24kx\" (UniqueName: \"kubernetes.io/projected/082eb313-d0d8-45e0-81df-67ca16b31c25-kube-api-access-n24kx\") pod \"authorino-7498df8756-jp8hr\" (UID: \"082eb313-d0d8-45e0-81df-67ca16b31c25\") " pod="kuadrant-system/authorino-7498df8756-jp8hr" Apr 16 19:30:30.024999 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:30.024963 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-t2klc"] Apr 16 19:30:30.028985 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:30:30.028946 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54f16fce_7c8a_4710_bcf1_a3a4c597d7c9.slice/crio-d6597624a2087a65d3f3feac5c9740bcea2f146d2d903ecd65a029a2299d2fe9 WatchSource:0}: Error finding container d6597624a2087a65d3f3feac5c9740bcea2f146d2d903ecd65a029a2299d2fe9: Status 404 returned error can't find the container with id d6597624a2087a65d3f3feac5c9740bcea2f146d2d903ecd65a029a2299d2fe9 Apr 16 19:30:30.036480 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:30.036437 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n24kx\" (UniqueName: \"kubernetes.io/projected/082eb313-d0d8-45e0-81df-67ca16b31c25-kube-api-access-n24kx\") pod \"authorino-7498df8756-jp8hr\" 
(UID: \"082eb313-d0d8-45e0-81df-67ca16b31c25\") " pod="kuadrant-system/authorino-7498df8756-jp8hr" Apr 16 19:30:30.057115 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:30.057081 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-jp8hr" Apr 16 19:30:30.236105 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:30.236046 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-jp8hr"] Apr 16 19:30:30.239139 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:30:30.239108 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod082eb313_d0d8_45e0_81df_67ca16b31c25.slice/crio-9140e2b885b22f2d032b7c773b79bcae72335ab6492277d92eaf67828b9a9f09 WatchSource:0}: Error finding container 9140e2b885b22f2d032b7c773b79bcae72335ab6492277d92eaf67828b9a9f09: Status 404 returned error can't find the container with id 9140e2b885b22f2d032b7c773b79bcae72335ab6492277d92eaf67828b9a9f09 Apr 16 19:30:30.261950 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:30.261912 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-t2klc" event={"ID":"54f16fce-7c8a-4710-bcf1-a3a4c597d7c9","Type":"ContainerStarted","Data":"d6597624a2087a65d3f3feac5c9740bcea2f146d2d903ecd65a029a2299d2fe9"} Apr 16 19:30:30.263213 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:30.263162 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-jp8hr" event={"ID":"082eb313-d0d8-45e0-81df-67ca16b31c25","Type":"ContainerStarted","Data":"9140e2b885b22f2d032b7c773b79bcae72335ab6492277d92eaf67828b9a9f09"} Apr 16 19:30:34.279362 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:34.279317 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-t2klc" 
event={"ID":"54f16fce-7c8a-4710-bcf1-a3a4c597d7c9","Type":"ContainerStarted","Data":"1a469b1e5c3b6929779078a9cb777e96a5917b3bfb26a4473198e96a3061ccd9"} Apr 16 19:30:34.280726 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:34.280698 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw" event={"ID":"1c876e05-8fef-479b-a538-c5142f58aca2","Type":"ContainerStarted","Data":"ea35bcdfa079f98d9a51bcd07dba5107fe5758c2a2e943833b7dc03eb876da18"} Apr 16 19:30:34.280861 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:34.280790 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw" Apr 16 19:30:34.281928 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:34.281906 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-jp8hr" event={"ID":"082eb313-d0d8-45e0-81df-67ca16b31c25","Type":"ContainerStarted","Data":"4fa6326ec17fff566eaed2c33b0cb54e3503164fe74d1b2994d8ce88c0e4f773"} Apr 16 19:30:34.295197 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:34.295142 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-t2klc" podStartSLOduration=2.052638031 podStartE2EDuration="5.295126708s" podCreationTimestamp="2026-04-16 19:30:29 +0000 UTC" firstStartedPulling="2026-04-16 19:30:30.031140919 +0000 UTC m=+737.518338735" lastFinishedPulling="2026-04-16 19:30:33.273629599 +0000 UTC m=+740.760827412" observedRunningTime="2026-04-16 19:30:34.294020375 +0000 UTC m=+741.781218207" watchObservedRunningTime="2026-04-16 19:30:34.295126708 +0000 UTC m=+741.782324541" Apr 16 19:30:34.324753 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:34.324701 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw" podStartSLOduration=2.1313190730000002 podStartE2EDuration="6.324682477s" 
podCreationTimestamp="2026-04-16 19:30:28 +0000 UTC" firstStartedPulling="2026-04-16 19:30:29.137161101 +0000 UTC m=+736.624358911" lastFinishedPulling="2026-04-16 19:30:33.330524492 +0000 UTC m=+740.817722315" observedRunningTime="2026-04-16 19:30:34.323861497 +0000 UTC m=+741.811059331" watchObservedRunningTime="2026-04-16 19:30:34.324682477 +0000 UTC m=+741.811880309" Apr 16 19:30:34.324939 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:34.324796 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-jp8hr" podStartSLOduration=2.291949976 podStartE2EDuration="5.324788624s" podCreationTimestamp="2026-04-16 19:30:29 +0000 UTC" firstStartedPulling="2026-04-16 19:30:30.240725816 +0000 UTC m=+737.727923627" lastFinishedPulling="2026-04-16 19:30:33.27356445 +0000 UTC m=+740.760762275" observedRunningTime="2026-04-16 19:30:34.308732542 +0000 UTC m=+741.795930374" watchObservedRunningTime="2026-04-16 19:30:34.324788624 +0000 UTC m=+741.811986457" Apr 16 19:30:34.339518 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:34.339486 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-t2klc"] Apr 16 19:30:36.288061 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:36.288022 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-t2klc" podUID="54f16fce-7c8a-4710-bcf1-a3a4c597d7c9" containerName="authorino" containerID="cri-o://1a469b1e5c3b6929779078a9cb777e96a5917b3bfb26a4473198e96a3061ccd9" gracePeriod=30 Apr 16 19:30:36.529249 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:36.529225 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-t2klc" Apr 16 19:30:36.572975 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:36.572903 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8fz4\" (UniqueName: \"kubernetes.io/projected/54f16fce-7c8a-4710-bcf1-a3a4c597d7c9-kube-api-access-m8fz4\") pod \"54f16fce-7c8a-4710-bcf1-a3a4c597d7c9\" (UID: \"54f16fce-7c8a-4710-bcf1-a3a4c597d7c9\") " Apr 16 19:30:36.574875 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:36.574848 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54f16fce-7c8a-4710-bcf1-a3a4c597d7c9-kube-api-access-m8fz4" (OuterVolumeSpecName: "kube-api-access-m8fz4") pod "54f16fce-7c8a-4710-bcf1-a3a4c597d7c9" (UID: "54f16fce-7c8a-4710-bcf1-a3a4c597d7c9"). InnerVolumeSpecName "kube-api-access-m8fz4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:30:36.673692 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:36.673660 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m8fz4\" (UniqueName: \"kubernetes.io/projected/54f16fce-7c8a-4710-bcf1-a3a4c597d7c9-kube-api-access-m8fz4\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:30:37.291615 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:37.291578 2580 generic.go:358] "Generic (PLEG): container finished" podID="54f16fce-7c8a-4710-bcf1-a3a4c597d7c9" containerID="1a469b1e5c3b6929779078a9cb777e96a5917b3bfb26a4473198e96a3061ccd9" exitCode=0 Apr 16 19:30:37.292054 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:37.291633 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-t2klc" Apr 16 19:30:37.292054 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:37.291669 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-t2klc" event={"ID":"54f16fce-7c8a-4710-bcf1-a3a4c597d7c9","Type":"ContainerDied","Data":"1a469b1e5c3b6929779078a9cb777e96a5917b3bfb26a4473198e96a3061ccd9"} Apr 16 19:30:37.292054 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:37.291721 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-t2klc" event={"ID":"54f16fce-7c8a-4710-bcf1-a3a4c597d7c9","Type":"ContainerDied","Data":"d6597624a2087a65d3f3feac5c9740bcea2f146d2d903ecd65a029a2299d2fe9"} Apr 16 19:30:37.292054 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:37.291742 2580 scope.go:117] "RemoveContainer" containerID="1a469b1e5c3b6929779078a9cb777e96a5917b3bfb26a4473198e96a3061ccd9" Apr 16 19:30:37.299246 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:37.299229 2580 scope.go:117] "RemoveContainer" containerID="1a469b1e5c3b6929779078a9cb777e96a5917b3bfb26a4473198e96a3061ccd9" Apr 16 19:30:37.299489 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:30:37.299464 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a469b1e5c3b6929779078a9cb777e96a5917b3bfb26a4473198e96a3061ccd9\": container with ID starting with 1a469b1e5c3b6929779078a9cb777e96a5917b3bfb26a4473198e96a3061ccd9 not found: ID does not exist" containerID="1a469b1e5c3b6929779078a9cb777e96a5917b3bfb26a4473198e96a3061ccd9" Apr 16 19:30:37.299545 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:37.299494 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a469b1e5c3b6929779078a9cb777e96a5917b3bfb26a4473198e96a3061ccd9"} err="failed to get container status \"1a469b1e5c3b6929779078a9cb777e96a5917b3bfb26a4473198e96a3061ccd9\": rpc error: code = NotFound 
desc = could not find container \"1a469b1e5c3b6929779078a9cb777e96a5917b3bfb26a4473198e96a3061ccd9\": container with ID starting with 1a469b1e5c3b6929779078a9cb777e96a5917b3bfb26a4473198e96a3061ccd9 not found: ID does not exist"
Apr 16 19:30:37.309118 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:37.309090 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-t2klc"]
Apr 16 19:30:37.311649 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:37.311629 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-t2klc"]
Apr 16 19:30:39.087158 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:39.087125 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54f16fce-7c8a-4710-bcf1-a3a4c597d7c9" path="/var/lib/kubelet/pods/54f16fce-7c8a-4710-bcf1-a3a4c597d7c9/volumes"
Apr 16 19:30:43.773840 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:43.773807 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8jpqw"]
Apr 16 19:30:43.774305 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:43.774059 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw" podUID="1c876e05-8fef-479b-a538-c5142f58aca2" containerName="limitador" containerID="cri-o://ea35bcdfa079f98d9a51bcd07dba5107fe5758c2a2e943833b7dc03eb876da18" gracePeriod=30
Apr 16 19:30:43.774760 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:43.774669 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw"
Apr 16 19:30:44.314881 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.314853 2580 generic.go:358] "Generic (PLEG): container finished" podID="1c876e05-8fef-479b-a538-c5142f58aca2" containerID="ea35bcdfa079f98d9a51bcd07dba5107fe5758c2a2e943833b7dc03eb876da18" exitCode=0
Apr 16 19:30:44.314995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.314920 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw" event={"ID":"1c876e05-8fef-479b-a538-c5142f58aca2","Type":"ContainerDied","Data":"ea35bcdfa079f98d9a51bcd07dba5107fe5758c2a2e943833b7dc03eb876da18"}
Apr 16 19:30:44.328948 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.328927 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw"
Apr 16 19:30:44.434509 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.434424 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1c876e05-8fef-479b-a538-c5142f58aca2-config-file\") pod \"1c876e05-8fef-479b-a538-c5142f58aca2\" (UID: \"1c876e05-8fef-479b-a538-c5142f58aca2\") "
Apr 16 19:30:44.434509 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.434483 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br898\" (UniqueName: \"kubernetes.io/projected/1c876e05-8fef-479b-a538-c5142f58aca2-kube-api-access-br898\") pod \"1c876e05-8fef-479b-a538-c5142f58aca2\" (UID: \"1c876e05-8fef-479b-a538-c5142f58aca2\") "
Apr 16 19:30:44.434788 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.434763 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c876e05-8fef-479b-a538-c5142f58aca2-config-file" (OuterVolumeSpecName: "config-file") pod "1c876e05-8fef-479b-a538-c5142f58aca2" (UID: "1c876e05-8fef-479b-a538-c5142f58aca2"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:30:44.436615 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.436588 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c876e05-8fef-479b-a538-c5142f58aca2-kube-api-access-br898" (OuterVolumeSpecName: "kube-api-access-br898") pod "1c876e05-8fef-479b-a538-c5142f58aca2" (UID: "1c876e05-8fef-479b-a538-c5142f58aca2"). InnerVolumeSpecName "kube-api-access-br898". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:30:44.535248 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.535211 2580 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1c876e05-8fef-479b-a538-c5142f58aca2-config-file\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\""
Apr 16 19:30:44.535248 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.535246 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-br898\" (UniqueName: \"kubernetes.io/projected/1c876e05-8fef-479b-a538-c5142f58aca2-kube-api-access-br898\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\""
Apr 16 19:30:44.714948 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.714917 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-hlpc9"]
Apr 16 19:30:44.715246 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.715230 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c876e05-8fef-479b-a538-c5142f58aca2" containerName="limitador"
Apr 16 19:30:44.715246 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.715245 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c876e05-8fef-479b-a538-c5142f58aca2" containerName="limitador"
Apr 16 19:30:44.715396 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.715260 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54f16fce-7c8a-4710-bcf1-a3a4c597d7c9" containerName="authorino"
Apr 16 19:30:44.715396 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.715266 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f16fce-7c8a-4710-bcf1-a3a4c597d7c9" containerName="authorino"
Apr 16 19:30:44.715396 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.715315 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="54f16fce-7c8a-4710-bcf1-a3a4c597d7c9" containerName="authorino"
Apr 16 19:30:44.715396 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.715322 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c876e05-8fef-479b-a538-c5142f58aca2" containerName="limitador"
Apr 16 19:30:44.718425 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.718409 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-hlpc9"
Apr 16 19:30:44.720840 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.720819 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 16 19:30:44.721050 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.721035 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-stqz9\""
Apr 16 19:30:44.728138 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.728115 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-hlpc9"]
Apr 16 19:30:44.837519 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.837488 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/63e1aaf6-0990-425e-87c6-bacb2d2b4237-data\") pod \"postgres-868db5846d-hlpc9\" (UID: \"63e1aaf6-0990-425e-87c6-bacb2d2b4237\") " pod="opendatahub/postgres-868db5846d-hlpc9"
Apr 16 19:30:44.837876 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.837526 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf795\" (UniqueName: \"kubernetes.io/projected/63e1aaf6-0990-425e-87c6-bacb2d2b4237-kube-api-access-wf795\") pod \"postgres-868db5846d-hlpc9\" (UID: \"63e1aaf6-0990-425e-87c6-bacb2d2b4237\") " pod="opendatahub/postgres-868db5846d-hlpc9"
Apr 16 19:30:44.938363 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.938304 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/63e1aaf6-0990-425e-87c6-bacb2d2b4237-data\") pod \"postgres-868db5846d-hlpc9\" (UID: \"63e1aaf6-0990-425e-87c6-bacb2d2b4237\") " pod="opendatahub/postgres-868db5846d-hlpc9"
Apr 16 19:30:44.938363 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.938362 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wf795\" (UniqueName: \"kubernetes.io/projected/63e1aaf6-0990-425e-87c6-bacb2d2b4237-kube-api-access-wf795\") pod \"postgres-868db5846d-hlpc9\" (UID: \"63e1aaf6-0990-425e-87c6-bacb2d2b4237\") " pod="opendatahub/postgres-868db5846d-hlpc9"
Apr 16 19:30:44.938714 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.938693 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/63e1aaf6-0990-425e-87c6-bacb2d2b4237-data\") pod \"postgres-868db5846d-hlpc9\" (UID: \"63e1aaf6-0990-425e-87c6-bacb2d2b4237\") " pod="opendatahub/postgres-868db5846d-hlpc9"
Apr 16 19:30:44.947518 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:44.947486 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf795\" (UniqueName: \"kubernetes.io/projected/63e1aaf6-0990-425e-87c6-bacb2d2b4237-kube-api-access-wf795\") pod \"postgres-868db5846d-hlpc9\" (UID: \"63e1aaf6-0990-425e-87c6-bacb2d2b4237\") " pod="opendatahub/postgres-868db5846d-hlpc9"
Apr 16 19:30:45.029542 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:45.029450 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-hlpc9"
Apr 16 19:30:45.151699 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:45.151676 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-hlpc9"]
Apr 16 19:30:45.154548 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:30:45.154518 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63e1aaf6_0990_425e_87c6_bacb2d2b4237.slice/crio-6251aec0114281519beb434d39b6cb3ce919ad21078fd37c7e3850e38895205d WatchSource:0}: Error finding container 6251aec0114281519beb434d39b6cb3ce919ad21078fd37c7e3850e38895205d: Status 404 returned error can't find the container with id 6251aec0114281519beb434d39b6cb3ce919ad21078fd37c7e3850e38895205d
Apr 16 19:30:45.319664 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:45.319564 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw" event={"ID":"1c876e05-8fef-479b-a538-c5142f58aca2","Type":"ContainerDied","Data":"fbfe66b1c99264ec6d0a5088c81505cf9f14e3901dd7ef706ed1fe118bb7c20a"}
Apr 16 19:30:45.319664 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:45.319611 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8jpqw"
Apr 16 19:30:45.319664 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:45.319623 2580 scope.go:117] "RemoveContainer" containerID="ea35bcdfa079f98d9a51bcd07dba5107fe5758c2a2e943833b7dc03eb876da18"
Apr 16 19:30:45.320707 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:45.320680 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-hlpc9" event={"ID":"63e1aaf6-0990-425e-87c6-bacb2d2b4237","Type":"ContainerStarted","Data":"6251aec0114281519beb434d39b6cb3ce919ad21078fd37c7e3850e38895205d"}
Apr 16 19:30:45.335488 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:45.335455 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8jpqw"]
Apr 16 19:30:45.340296 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:45.340271 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8jpqw"]
Apr 16 19:30:47.088345 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:47.088309 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c876e05-8fef-479b-a538-c5142f58aca2" path="/var/lib/kubelet/pods/1c876e05-8fef-479b-a538-c5142f58aca2/volumes"
Apr 16 19:30:51.345697 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:51.345659 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-hlpc9" event={"ID":"63e1aaf6-0990-425e-87c6-bacb2d2b4237","Type":"ContainerStarted","Data":"b6bf626e8a3687b59a5abdce93c560722564c9fa591ac351fc0289ddf388f1fc"}
Apr 16 19:30:51.346214 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:51.345808 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-hlpc9"
Apr 16 19:30:51.362523 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:51.362471 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-hlpc9" podStartSLOduration=2.216111694 podStartE2EDuration="7.36245244s" podCreationTimestamp="2026-04-16 19:30:44 +0000 UTC" firstStartedPulling="2026-04-16 19:30:45.156037776 +0000 UTC m=+752.643235593" lastFinishedPulling="2026-04-16 19:30:50.302378529 +0000 UTC m=+757.789576339" observedRunningTime="2026-04-16 19:30:51.360891243 +0000 UTC m=+758.848089075" watchObservedRunningTime="2026-04-16 19:30:51.36245244 +0000 UTC m=+758.849650275"
Apr 16 19:30:57.377377 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:57.377343 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-hlpc9"
Apr 16 19:30:58.229244 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:58.229209 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-868c997c74-gjwrm"]
Apr 16 19:30:58.232221 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:58.232177 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-868c997c74-gjwrm"
Apr 16 19:30:58.235141 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:58.235117 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 16 19:30:58.235327 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:58.235176 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-cczkd\""
Apr 16 19:30:58.235511 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:58.235235 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 16 19:30:58.244680 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:58.244654 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-868c997c74-gjwrm"]
Apr 16 19:30:58.353558 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:58.353525 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkstj\" (UniqueName: \"kubernetes.io/projected/a73ff427-3c2f-49ce-8ff9-259f21021b2a-kube-api-access-qkstj\") pod \"maas-api-868c997c74-gjwrm\" (UID: \"a73ff427-3c2f-49ce-8ff9-259f21021b2a\") " pod="opendatahub/maas-api-868c997c74-gjwrm"
Apr 16 19:30:58.353745 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:58.353604 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a73ff427-3c2f-49ce-8ff9-259f21021b2a-maas-api-tls\") pod \"maas-api-868c997c74-gjwrm\" (UID: \"a73ff427-3c2f-49ce-8ff9-259f21021b2a\") " pod="opendatahub/maas-api-868c997c74-gjwrm"
Apr 16 19:30:58.455106 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:58.455047 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a73ff427-3c2f-49ce-8ff9-259f21021b2a-maas-api-tls\") pod \"maas-api-868c997c74-gjwrm\" (UID: \"a73ff427-3c2f-49ce-8ff9-259f21021b2a\") " pod="opendatahub/maas-api-868c997c74-gjwrm"
Apr 16 19:30:58.455611 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:58.455155 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkstj\" (UniqueName: \"kubernetes.io/projected/a73ff427-3c2f-49ce-8ff9-259f21021b2a-kube-api-access-qkstj\") pod \"maas-api-868c997c74-gjwrm\" (UID: \"a73ff427-3c2f-49ce-8ff9-259f21021b2a\") " pod="opendatahub/maas-api-868c997c74-gjwrm"
Apr 16 19:30:58.455611 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:30:58.455228 2580 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found
Apr 16 19:30:58.455611 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:30:58.455305 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a73ff427-3c2f-49ce-8ff9-259f21021b2a-maas-api-tls podName:a73ff427-3c2f-49ce-8ff9-259f21021b2a nodeName:}" failed. No retries permitted until 2026-04-16 19:30:58.955284526 +0000 UTC m=+766.442482339 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/a73ff427-3c2f-49ce-8ff9-259f21021b2a-maas-api-tls") pod "maas-api-868c997c74-gjwrm" (UID: "a73ff427-3c2f-49ce-8ff9-259f21021b2a") : secret "maas-api-serving-cert" not found
Apr 16 19:30:58.468797 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:58.468765 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkstj\" (UniqueName: \"kubernetes.io/projected/a73ff427-3c2f-49ce-8ff9-259f21021b2a-kube-api-access-qkstj\") pod \"maas-api-868c997c74-gjwrm\" (UID: \"a73ff427-3c2f-49ce-8ff9-259f21021b2a\") " pod="opendatahub/maas-api-868c997c74-gjwrm"
Apr 16 19:30:58.960115 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:58.960079 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a73ff427-3c2f-49ce-8ff9-259f21021b2a-maas-api-tls\") pod \"maas-api-868c997c74-gjwrm\" (UID: \"a73ff427-3c2f-49ce-8ff9-259f21021b2a\") " pod="opendatahub/maas-api-868c997c74-gjwrm"
Apr 16 19:30:58.962655 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:58.962621 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a73ff427-3c2f-49ce-8ff9-259f21021b2a-maas-api-tls\") pod \"maas-api-868c997c74-gjwrm\" (UID: \"a73ff427-3c2f-49ce-8ff9-259f21021b2a\") " pod="opendatahub/maas-api-868c997c74-gjwrm"
Apr 16 19:30:59.000783 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.000753 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-nv4sr"]
Apr 16 19:30:59.003802 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.003782 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-nv4sr"
Apr 16 19:30:59.019909 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.019884 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-nv4sr"]
Apr 16 19:30:59.060984 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.060951 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f92m7\" (UniqueName: \"kubernetes.io/projected/0425eb24-ec48-4ffd-863d-51135aa5ce39-kube-api-access-f92m7\") pod \"authorino-8b475cf9f-nv4sr\" (UID: \"0425eb24-ec48-4ffd-863d-51135aa5ce39\") " pod="kuadrant-system/authorino-8b475cf9f-nv4sr"
Apr 16 19:30:59.146810 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.146770 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-868c997c74-gjwrm"
Apr 16 19:30:59.162056 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.161998 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f92m7\" (UniqueName: \"kubernetes.io/projected/0425eb24-ec48-4ffd-863d-51135aa5ce39-kube-api-access-f92m7\") pod \"authorino-8b475cf9f-nv4sr\" (UID: \"0425eb24-ec48-4ffd-863d-51135aa5ce39\") " pod="kuadrant-system/authorino-8b475cf9f-nv4sr"
Apr 16 19:30:59.172292 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.172234 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f92m7\" (UniqueName: \"kubernetes.io/projected/0425eb24-ec48-4ffd-863d-51135aa5ce39-kube-api-access-f92m7\") pod \"authorino-8b475cf9f-nv4sr\" (UID: \"0425eb24-ec48-4ffd-863d-51135aa5ce39\") " pod="kuadrant-system/authorino-8b475cf9f-nv4sr"
Apr 16 19:30:59.313758 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.313719 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-nv4sr"
Apr 16 19:30:59.358650 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.358567 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-nv4sr"]
Apr 16 19:30:59.360203 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.360140 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-868c997c74-gjwrm"]
Apr 16 19:30:59.364085 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:30:59.363986 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda73ff427_3c2f_49ce_8ff9_259f21021b2a.slice/crio-bc96bb19fd0a341ba603357abe5da2b15045b833d9b037186634fa1f5e36d45c WatchSource:0}: Error finding container bc96bb19fd0a341ba603357abe5da2b15045b833d9b037186634fa1f5e36d45c: Status 404 returned error can't find the container with id bc96bb19fd0a341ba603357abe5da2b15045b833d9b037186634fa1f5e36d45c
Apr 16 19:30:59.372370 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.372342 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-868c997c74-gjwrm" event={"ID":"a73ff427-3c2f-49ce-8ff9-259f21021b2a","Type":"ContainerStarted","Data":"bc96bb19fd0a341ba603357abe5da2b15045b833d9b037186634fa1f5e36d45c"}
Apr 16 19:30:59.406750 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.406715 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7695c5c68-tr8x9"]
Apr 16 19:30:59.411518 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.411492 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7695c5c68-tr8x9"
Apr 16 19:30:59.430081 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.430007 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7695c5c68-tr8x9"]
Apr 16 19:30:59.467723 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.467688 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2xsf\" (UniqueName: \"kubernetes.io/projected/c90c1148-408b-417a-a2ef-abd66d936ee6-kube-api-access-l2xsf\") pod \"authorino-7695c5c68-tr8x9\" (UID: \"c90c1148-408b-417a-a2ef-abd66d936ee6\") " pod="kuadrant-system/authorino-7695c5c68-tr8x9"
Apr 16 19:30:59.514837 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.514806 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-nv4sr"]
Apr 16 19:30:59.516137 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:30:59.516114 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0425eb24_ec48_4ffd_863d_51135aa5ce39.slice/crio-adff254a945685574859962d37c83ca7e2d36252dcb81500579be28ef9a95719 WatchSource:0}: Error finding container adff254a945685574859962d37c83ca7e2d36252dcb81500579be28ef9a95719: Status 404 returned error can't find the container with id adff254a945685574859962d37c83ca7e2d36252dcb81500579be28ef9a95719
Apr 16 19:30:59.568732 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.568690 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2xsf\" (UniqueName: \"kubernetes.io/projected/c90c1148-408b-417a-a2ef-abd66d936ee6-kube-api-access-l2xsf\") pod \"authorino-7695c5c68-tr8x9\" (UID: \"c90c1148-408b-417a-a2ef-abd66d936ee6\") " pod="kuadrant-system/authorino-7695c5c68-tr8x9"
Apr 16 19:30:59.581863 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.581837 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2xsf\" (UniqueName: \"kubernetes.io/projected/c90c1148-408b-417a-a2ef-abd66d936ee6-kube-api-access-l2xsf\") pod \"authorino-7695c5c68-tr8x9\" (UID: \"c90c1148-408b-417a-a2ef-abd66d936ee6\") " pod="kuadrant-system/authorino-7695c5c68-tr8x9"
Apr 16 19:30:59.694403 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.694365 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7695c5c68-tr8x9"]
Apr 16 19:30:59.694616 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.694601 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7695c5c68-tr8x9"
Apr 16 19:30:59.728138 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.728104 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5dd77fbc76-v9bt2"]
Apr 16 19:30:59.733453 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.733425 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5dd77fbc76-v9bt2"
Apr 16 19:30:59.740172 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.740146 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 16 19:30:59.754539 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.751452 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5dd77fbc76-v9bt2"]
Apr 16 19:30:59.871720 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.871686 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6q5z\" (UniqueName: \"kubernetes.io/projected/b82bf6e4-b659-444d-815d-ec2632766fc9-kube-api-access-s6q5z\") pod \"authorino-5dd77fbc76-v9bt2\" (UID: \"b82bf6e4-b659-444d-815d-ec2632766fc9\") " pod="kuadrant-system/authorino-5dd77fbc76-v9bt2"
Apr 16 19:30:59.871881 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.871732 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b82bf6e4-b659-444d-815d-ec2632766fc9-tls-cert\") pod \"authorino-5dd77fbc76-v9bt2\" (UID: \"b82bf6e4-b659-444d-815d-ec2632766fc9\") " pod="kuadrant-system/authorino-5dd77fbc76-v9bt2"
Apr 16 19:30:59.874949 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.874927 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7695c5c68-tr8x9"]
Apr 16 19:30:59.877449 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:30:59.877422 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc90c1148_408b_417a_a2ef_abd66d936ee6.slice/crio-99d7da7adaf2e84c2bffc0ee40d7dd664f3b0b324a89e5c934e5e930bb01c89a WatchSource:0}: Error finding container 99d7da7adaf2e84c2bffc0ee40d7dd664f3b0b324a89e5c934e5e930bb01c89a: Status 404 returned error can't find the container with id 99d7da7adaf2e84c2bffc0ee40d7dd664f3b0b324a89e5c934e5e930bb01c89a
Apr 16 19:30:59.972507 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.972478 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b82bf6e4-b659-444d-815d-ec2632766fc9-tls-cert\") pod \"authorino-5dd77fbc76-v9bt2\" (UID: \"b82bf6e4-b659-444d-815d-ec2632766fc9\") " pod="kuadrant-system/authorino-5dd77fbc76-v9bt2"
Apr 16 19:30:59.972672 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.972586 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6q5z\" (UniqueName: \"kubernetes.io/projected/b82bf6e4-b659-444d-815d-ec2632766fc9-kube-api-access-s6q5z\") pod \"authorino-5dd77fbc76-v9bt2\" (UID: \"b82bf6e4-b659-444d-815d-ec2632766fc9\") " pod="kuadrant-system/authorino-5dd77fbc76-v9bt2"
Apr 16 19:30:59.975127 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.975104 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b82bf6e4-b659-444d-815d-ec2632766fc9-tls-cert\") pod \"authorino-5dd77fbc76-v9bt2\" (UID: \"b82bf6e4-b659-444d-815d-ec2632766fc9\") " pod="kuadrant-system/authorino-5dd77fbc76-v9bt2"
Apr 16 19:30:59.982388 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:30:59.982361 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6q5z\" (UniqueName: \"kubernetes.io/projected/b82bf6e4-b659-444d-815d-ec2632766fc9-kube-api-access-s6q5z\") pod \"authorino-5dd77fbc76-v9bt2\" (UID: \"b82bf6e4-b659-444d-815d-ec2632766fc9\") " pod="kuadrant-system/authorino-5dd77fbc76-v9bt2"
Apr 16 19:31:00.060120 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:00.060021 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5dd77fbc76-v9bt2"
Apr 16 19:31:00.311954 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:00.311657 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5dd77fbc76-v9bt2"]
Apr 16 19:31:00.317082 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:31:00.317043 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb82bf6e4_b659_444d_815d_ec2632766fc9.slice/crio-ffdd11ab21ed134ceb2ec4aecd3474bd0aa0b71fab42a7f9ad31ba64dffd96cb WatchSource:0}: Error finding container ffdd11ab21ed134ceb2ec4aecd3474bd0aa0b71fab42a7f9ad31ba64dffd96cb: Status 404 returned error can't find the container with id ffdd11ab21ed134ceb2ec4aecd3474bd0aa0b71fab42a7f9ad31ba64dffd96cb
Apr 16 19:31:00.385619 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:00.385556 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7695c5c68-tr8x9" event={"ID":"c90c1148-408b-417a-a2ef-abd66d936ee6","Type":"ContainerStarted","Data":"99d7da7adaf2e84c2bffc0ee40d7dd664f3b0b324a89e5c934e5e930bb01c89a"}
Apr 16 19:31:00.387540 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:00.387504 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-nv4sr" event={"ID":"0425eb24-ec48-4ffd-863d-51135aa5ce39","Type":"ContainerStarted","Data":"ade888a4b73406ac5bb4ec9f348780c048eebe31c6b13560c82258e0a4567bbc"}
Apr 16 19:31:00.387698 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:00.387548 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-nv4sr" event={"ID":"0425eb24-ec48-4ffd-863d-51135aa5ce39","Type":"ContainerStarted","Data":"adff254a945685574859962d37c83ca7e2d36252dcb81500579be28ef9a95719"}
Apr 16 19:31:00.387831 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:00.387806 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-nv4sr" podUID="0425eb24-ec48-4ffd-863d-51135aa5ce39" containerName="authorino" containerID="cri-o://ade888a4b73406ac5bb4ec9f348780c048eebe31c6b13560c82258e0a4567bbc" gracePeriod=30
Apr 16 19:31:00.391231 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:00.391163 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5dd77fbc76-v9bt2" event={"ID":"b82bf6e4-b659-444d-815d-ec2632766fc9","Type":"ContainerStarted","Data":"ffdd11ab21ed134ceb2ec4aecd3474bd0aa0b71fab42a7f9ad31ba64dffd96cb"}
Apr 16 19:31:00.417988 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:00.417822 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-nv4sr" podStartSLOduration=2.024265953 podStartE2EDuration="2.417800964s" podCreationTimestamp="2026-04-16 19:30:58 +0000 UTC" firstStartedPulling="2026-04-16 19:30:59.517441585 +0000 UTC m=+767.004639395" lastFinishedPulling="2026-04-16 19:30:59.910976595 +0000 UTC m=+767.398174406" observedRunningTime="2026-04-16 19:31:00.415079366 +0000 UTC m=+767.902277212" watchObservedRunningTime="2026-04-16 19:31:00.417800964 +0000 UTC m=+767.904998797"
Apr 16 19:31:00.730971 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:00.730856 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-nv4sr"
Apr 16 19:31:00.885953 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:00.885811 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f92m7\" (UniqueName: \"kubernetes.io/projected/0425eb24-ec48-4ffd-863d-51135aa5ce39-kube-api-access-f92m7\") pod \"0425eb24-ec48-4ffd-863d-51135aa5ce39\" (UID: \"0425eb24-ec48-4ffd-863d-51135aa5ce39\") "
Apr 16 19:31:00.888619 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:00.888570 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0425eb24-ec48-4ffd-863d-51135aa5ce39-kube-api-access-f92m7" (OuterVolumeSpecName: "kube-api-access-f92m7") pod "0425eb24-ec48-4ffd-863d-51135aa5ce39" (UID: "0425eb24-ec48-4ffd-863d-51135aa5ce39"). InnerVolumeSpecName "kube-api-access-f92m7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:31:00.986843 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:00.986802 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f92m7\" (UniqueName: \"kubernetes.io/projected/0425eb24-ec48-4ffd-863d-51135aa5ce39-kube-api-access-f92m7\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\""
Apr 16 19:31:01.396580 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:01.396537 2580 generic.go:358] "Generic (PLEG): container finished" podID="0425eb24-ec48-4ffd-863d-51135aa5ce39" containerID="ade888a4b73406ac5bb4ec9f348780c048eebe31c6b13560c82258e0a4567bbc" exitCode=0
Apr 16 19:31:01.396774 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:01.396633 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-nv4sr"
Apr 16 19:31:01.396853 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:01.396798 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-nv4sr" event={"ID":"0425eb24-ec48-4ffd-863d-51135aa5ce39","Type":"ContainerDied","Data":"ade888a4b73406ac5bb4ec9f348780c048eebe31c6b13560c82258e0a4567bbc"}
Apr 16 19:31:01.396915 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:01.396832 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-nv4sr" event={"ID":"0425eb24-ec48-4ffd-863d-51135aa5ce39","Type":"ContainerDied","Data":"adff254a945685574859962d37c83ca7e2d36252dcb81500579be28ef9a95719"}
Apr 16 19:31:01.396915 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:01.396874 2580 scope.go:117] "RemoveContainer" containerID="ade888a4b73406ac5bb4ec9f348780c048eebe31c6b13560c82258e0a4567bbc"
Apr 16 19:31:01.399672 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:01.399039 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5dd77fbc76-v9bt2" event={"ID":"b82bf6e4-b659-444d-815d-ec2632766fc9","Type":"ContainerStarted","Data":"9a8d8488785b89fbc7f2f33cd496bffab715c517e5d36bd9ee39771804a43ab1"}
Apr 16 19:31:01.402326 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:01.402104 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7695c5c68-tr8x9" event={"ID":"c90c1148-408b-417a-a2ef-abd66d936ee6","Type":"ContainerStarted","Data":"739c85cf7bdcfd8df976b7644635f52785e12e1caa007cf8506c4da0d4f7ebb2"}
Apr 16 19:31:01.402326 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:01.402305 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7695c5c68-tr8x9" podUID="c90c1148-408b-417a-a2ef-abd66d936ee6" containerName="authorino" containerID="cri-o://739c85cf7bdcfd8df976b7644635f52785e12e1caa007cf8506c4da0d4f7ebb2" gracePeriod=30
Apr 16 19:31:01.416592 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:01.416568 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-nv4sr"]
Apr 16 19:31:01.422074 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:01.422043 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-nv4sr"]
Apr 16 19:31:01.440675 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:01.440621 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5dd77fbc76-v9bt2" podStartSLOduration=2.092201666 podStartE2EDuration="2.440604933s" podCreationTimestamp="2026-04-16 19:30:59 +0000 UTC" firstStartedPulling="2026-04-16 19:31:00.319357654 +0000 UTC m=+767.806555470" lastFinishedPulling="2026-04-16 19:31:00.667760922 +0000 UTC m=+768.154958737" observedRunningTime="2026-04-16 19:31:01.43916223 +0000 UTC m=+768.926360073" watchObservedRunningTime="2026-04-16 19:31:01.440604933 +0000 UTC m=+768.927802764"
Apr 16 19:31:01.471068 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:01.471010 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7695c5c68-tr8x9" podStartSLOduration=2.053860348 podStartE2EDuration="2.470991982s" podCreationTimestamp="2026-04-16 19:30:59 +0000 UTC" firstStartedPulling="2026-04-16 19:30:59.878721372 +0000 UTC m=+767.365919183" lastFinishedPulling="2026-04-16 19:31:00.295853001 +0000 UTC m=+767.783050817" observedRunningTime="2026-04-16 19:31:01.466251715 +0000 UTC m=+768.953449548" watchObservedRunningTime="2026-04-16 19:31:01.470991982 +0000 UTC m=+768.958189814"
Apr 16 19:31:01.492655 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:01.492265 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-jp8hr"]
Apr 16 19:31:01.492655 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:01.492510 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-jp8hr" podUID="082eb313-d0d8-45e0-81df-67ca16b31c25" containerName="authorino" containerID="cri-o://4fa6326ec17fff566eaed2c33b0cb54e3503164fe74d1b2994d8ce88c0e4f773" gracePeriod=30
Apr 16 19:31:02.407467 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:02.407432 2580 generic.go:358] "Generic (PLEG): container finished" podID="c90c1148-408b-417a-a2ef-abd66d936ee6" containerID="739c85cf7bdcfd8df976b7644635f52785e12e1caa007cf8506c4da0d4f7ebb2" exitCode=0
Apr 16 19:31:02.408001 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:02.407527 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7695c5c68-tr8x9" event={"ID":"c90c1148-408b-417a-a2ef-abd66d936ee6","Type":"ContainerDied","Data":"739c85cf7bdcfd8df976b7644635f52785e12e1caa007cf8506c4da0d4f7ebb2"}
Apr 16 19:31:02.614211 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:02.613937 2580 scope.go:117] "RemoveContainer" containerID="ade888a4b73406ac5bb4ec9f348780c048eebe31c6b13560c82258e0a4567bbc"
Apr 16 19:31:02.614367 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:31:02.614344 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade888a4b73406ac5bb4ec9f348780c048eebe31c6b13560c82258e0a4567bbc\": container with ID starting with ade888a4b73406ac5bb4ec9f348780c048eebe31c6b13560c82258e0a4567bbc not found: ID does not exist" containerID="ade888a4b73406ac5bb4ec9f348780c048eebe31c6b13560c82258e0a4567bbc"
Apr 16 19:31:02.614481 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:02.614377 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade888a4b73406ac5bb4ec9f348780c048eebe31c6b13560c82258e0a4567bbc"} err="failed to get container status \"ade888a4b73406ac5bb4ec9f348780c048eebe31c6b13560c82258e0a4567bbc\": rpc error: code = NotFound desc = could not find container \"ade888a4b73406ac5bb4ec9f348780c048eebe31c6b13560c82258e0a4567bbc\":
container with ID starting with ade888a4b73406ac5bb4ec9f348780c048eebe31c6b13560c82258e0a4567bbc not found: ID does not exist" Apr 16 19:31:02.949831 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:02.949809 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7695c5c68-tr8x9" Apr 16 19:31:02.953509 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:02.953462 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-jp8hr" Apr 16 19:31:03.004863 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.004829 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2xsf\" (UniqueName: \"kubernetes.io/projected/c90c1148-408b-417a-a2ef-abd66d936ee6-kube-api-access-l2xsf\") pod \"c90c1148-408b-417a-a2ef-abd66d936ee6\" (UID: \"c90c1148-408b-417a-a2ef-abd66d936ee6\") " Apr 16 19:31:03.005028 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.004879 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n24kx\" (UniqueName: \"kubernetes.io/projected/082eb313-d0d8-45e0-81df-67ca16b31c25-kube-api-access-n24kx\") pod \"082eb313-d0d8-45e0-81df-67ca16b31c25\" (UID: \"082eb313-d0d8-45e0-81df-67ca16b31c25\") " Apr 16 19:31:03.007011 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.006976 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/082eb313-d0d8-45e0-81df-67ca16b31c25-kube-api-access-n24kx" (OuterVolumeSpecName: "kube-api-access-n24kx") pod "082eb313-d0d8-45e0-81df-67ca16b31c25" (UID: "082eb313-d0d8-45e0-81df-67ca16b31c25"). InnerVolumeSpecName "kube-api-access-n24kx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:31:03.007240 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.007214 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90c1148-408b-417a-a2ef-abd66d936ee6-kube-api-access-l2xsf" (OuterVolumeSpecName: "kube-api-access-l2xsf") pod "c90c1148-408b-417a-a2ef-abd66d936ee6" (UID: "c90c1148-408b-417a-a2ef-abd66d936ee6"). InnerVolumeSpecName "kube-api-access-l2xsf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:31:03.087369 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.087337 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0425eb24-ec48-4ffd-863d-51135aa5ce39" path="/var/lib/kubelet/pods/0425eb24-ec48-4ffd-863d-51135aa5ce39/volumes" Apr 16 19:31:03.106386 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.106360 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l2xsf\" (UniqueName: \"kubernetes.io/projected/c90c1148-408b-417a-a2ef-abd66d936ee6-kube-api-access-l2xsf\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:31:03.106386 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.106385 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n24kx\" (UniqueName: \"kubernetes.io/projected/082eb313-d0d8-45e0-81df-67ca16b31c25-kube-api-access-n24kx\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:31:03.412247 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.412218 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7695c5c68-tr8x9" Apr 16 19:31:03.412686 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.412217 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7695c5c68-tr8x9" event={"ID":"c90c1148-408b-417a-a2ef-abd66d936ee6","Type":"ContainerDied","Data":"99d7da7adaf2e84c2bffc0ee40d7dd664f3b0b324a89e5c934e5e930bb01c89a"} Apr 16 19:31:03.412686 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.412348 2580 scope.go:117] "RemoveContainer" containerID="739c85cf7bdcfd8df976b7644635f52785e12e1caa007cf8506c4da0d4f7ebb2" Apr 16 19:31:03.413438 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.413415 2580 generic.go:358] "Generic (PLEG): container finished" podID="082eb313-d0d8-45e0-81df-67ca16b31c25" containerID="4fa6326ec17fff566eaed2c33b0cb54e3503164fe74d1b2994d8ce88c0e4f773" exitCode=0 Apr 16 19:31:03.413541 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.413458 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-jp8hr" Apr 16 19:31:03.413541 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.413469 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-jp8hr" event={"ID":"082eb313-d0d8-45e0-81df-67ca16b31c25","Type":"ContainerDied","Data":"4fa6326ec17fff566eaed2c33b0cb54e3503164fe74d1b2994d8ce88c0e4f773"} Apr 16 19:31:03.413541 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.413503 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-jp8hr" event={"ID":"082eb313-d0d8-45e0-81df-67ca16b31c25","Type":"ContainerDied","Data":"9140e2b885b22f2d032b7c773b79bcae72335ab6492277d92eaf67828b9a9f09"} Apr 16 19:31:03.415414 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.415395 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-868c997c74-gjwrm" event={"ID":"a73ff427-3c2f-49ce-8ff9-259f21021b2a","Type":"ContainerStarted","Data":"d49cc29cdd01c9e8389811168eaffced23484017347db4c0cc4191824e4986e7"} Apr 16 19:31:03.415533 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.415512 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-868c997c74-gjwrm" Apr 16 19:31:03.420585 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.420567 2580 scope.go:117] "RemoveContainer" containerID="4fa6326ec17fff566eaed2c33b0cb54e3503164fe74d1b2994d8ce88c0e4f773" Apr 16 19:31:03.427501 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.427482 2580 scope.go:117] "RemoveContainer" containerID="4fa6326ec17fff566eaed2c33b0cb54e3503164fe74d1b2994d8ce88c0e4f773" Apr 16 19:31:03.427741 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:31:03.427725 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fa6326ec17fff566eaed2c33b0cb54e3503164fe74d1b2994d8ce88c0e4f773\": container with ID starting with 
4fa6326ec17fff566eaed2c33b0cb54e3503164fe74d1b2994d8ce88c0e4f773 not found: ID does not exist" containerID="4fa6326ec17fff566eaed2c33b0cb54e3503164fe74d1b2994d8ce88c0e4f773" Apr 16 19:31:03.427804 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.427746 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fa6326ec17fff566eaed2c33b0cb54e3503164fe74d1b2994d8ce88c0e4f773"} err="failed to get container status \"4fa6326ec17fff566eaed2c33b0cb54e3503164fe74d1b2994d8ce88c0e4f773\": rpc error: code = NotFound desc = could not find container \"4fa6326ec17fff566eaed2c33b0cb54e3503164fe74d1b2994d8ce88c0e4f773\": container with ID starting with 4fa6326ec17fff566eaed2c33b0cb54e3503164fe74d1b2994d8ce88c0e4f773 not found: ID does not exist" Apr 16 19:31:03.439100 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.439058 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-868c997c74-gjwrm" podStartSLOduration=1.8510076290000002 podStartE2EDuration="5.439045799s" podCreationTimestamp="2026-04-16 19:30:58 +0000 UTC" firstStartedPulling="2026-04-16 19:30:59.365171507 +0000 UTC m=+766.852369318" lastFinishedPulling="2026-04-16 19:31:02.953209663 +0000 UTC m=+770.440407488" observedRunningTime="2026-04-16 19:31:03.436838785 +0000 UTC m=+770.924036619" watchObservedRunningTime="2026-04-16 19:31:03.439045799 +0000 UTC m=+770.926243631" Apr 16 19:31:03.457442 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.457411 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7695c5c68-tr8x9"] Apr 16 19:31:03.461527 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.461504 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7695c5c68-tr8x9"] Apr 16 19:31:03.473572 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.473546 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-jp8hr"] Apr 16 
19:31:03.478623 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:03.478591 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-jp8hr"] Apr 16 19:31:05.087175 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:05.087138 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="082eb313-d0d8-45e0-81df-67ca16b31c25" path="/var/lib/kubelet/pods/082eb313-d0d8-45e0-81df-67ca16b31c25/volumes" Apr 16 19:31:05.087611 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:05.087595 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c90c1148-408b-417a-a2ef-abd66d936ee6" path="/var/lib/kubelet/pods/c90c1148-408b-417a-a2ef-abd66d936ee6/volumes" Apr 16 19:31:08.600333 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:08.600297 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-868c997c74-gjwrm"] Apr 16 19:31:08.600820 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:08.600541 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-868c997c74-gjwrm" podUID="a73ff427-3c2f-49ce-8ff9-259f21021b2a" containerName="maas-api" containerID="cri-o://d49cc29cdd01c9e8389811168eaffced23484017347db4c0cc4191824e4986e7" gracePeriod=30 Apr 16 19:31:08.605768 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:08.605744 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-868c997c74-gjwrm" Apr 16 19:31:08.844672 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:08.844651 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-868c997c74-gjwrm" Apr 16 19:31:08.957678 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:08.957649 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkstj\" (UniqueName: \"kubernetes.io/projected/a73ff427-3c2f-49ce-8ff9-259f21021b2a-kube-api-access-qkstj\") pod \"a73ff427-3c2f-49ce-8ff9-259f21021b2a\" (UID: \"a73ff427-3c2f-49ce-8ff9-259f21021b2a\") " Apr 16 19:31:08.957876 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:08.957706 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a73ff427-3c2f-49ce-8ff9-259f21021b2a-maas-api-tls\") pod \"a73ff427-3c2f-49ce-8ff9-259f21021b2a\" (UID: \"a73ff427-3c2f-49ce-8ff9-259f21021b2a\") " Apr 16 19:31:08.960017 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:08.959984 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a73ff427-3c2f-49ce-8ff9-259f21021b2a-kube-api-access-qkstj" (OuterVolumeSpecName: "kube-api-access-qkstj") pod "a73ff427-3c2f-49ce-8ff9-259f21021b2a" (UID: "a73ff427-3c2f-49ce-8ff9-259f21021b2a"). InnerVolumeSpecName "kube-api-access-qkstj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:31:08.960017 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:08.959997 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73ff427-3c2f-49ce-8ff9-259f21021b2a-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "a73ff427-3c2f-49ce-8ff9-259f21021b2a" (UID: "a73ff427-3c2f-49ce-8ff9-259f21021b2a"). InnerVolumeSpecName "maas-api-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:31:09.058946 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:09.058913 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qkstj\" (UniqueName: \"kubernetes.io/projected/a73ff427-3c2f-49ce-8ff9-259f21021b2a-kube-api-access-qkstj\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:31:09.058946 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:09.058944 2580 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a73ff427-3c2f-49ce-8ff9-259f21021b2a-maas-api-tls\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:31:09.442084 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:09.442053 2580 generic.go:358] "Generic (PLEG): container finished" podID="a73ff427-3c2f-49ce-8ff9-259f21021b2a" containerID="d49cc29cdd01c9e8389811168eaffced23484017347db4c0cc4191824e4986e7" exitCode=0 Apr 16 19:31:09.442269 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:09.442114 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-868c997c74-gjwrm" event={"ID":"a73ff427-3c2f-49ce-8ff9-259f21021b2a","Type":"ContainerDied","Data":"d49cc29cdd01c9e8389811168eaffced23484017347db4c0cc4191824e4986e7"} Apr 16 19:31:09.442269 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:09.442139 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-868c997c74-gjwrm" event={"ID":"a73ff427-3c2f-49ce-8ff9-259f21021b2a","Type":"ContainerDied","Data":"bc96bb19fd0a341ba603357abe5da2b15045b833d9b037186634fa1f5e36d45c"} Apr 16 19:31:09.442269 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:09.442153 2580 scope.go:117] "RemoveContainer" containerID="d49cc29cdd01c9e8389811168eaffced23484017347db4c0cc4191824e4986e7" Apr 16 19:31:09.442269 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:09.442116 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-868c997c74-gjwrm" Apr 16 19:31:09.449864 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:09.449841 2580 scope.go:117] "RemoveContainer" containerID="d49cc29cdd01c9e8389811168eaffced23484017347db4c0cc4191824e4986e7" Apr 16 19:31:09.450137 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:31:09.450112 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d49cc29cdd01c9e8389811168eaffced23484017347db4c0cc4191824e4986e7\": container with ID starting with d49cc29cdd01c9e8389811168eaffced23484017347db4c0cc4191824e4986e7 not found: ID does not exist" containerID="d49cc29cdd01c9e8389811168eaffced23484017347db4c0cc4191824e4986e7" Apr 16 19:31:09.450238 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:09.450150 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d49cc29cdd01c9e8389811168eaffced23484017347db4c0cc4191824e4986e7"} err="failed to get container status \"d49cc29cdd01c9e8389811168eaffced23484017347db4c0cc4191824e4986e7\": rpc error: code = NotFound desc = could not find container \"d49cc29cdd01c9e8389811168eaffced23484017347db4c0cc4191824e4986e7\": container with ID starting with d49cc29cdd01c9e8389811168eaffced23484017347db4c0cc4191824e4986e7 not found: ID does not exist" Apr 16 19:31:09.459117 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:09.459064 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-868c997c74-gjwrm"] Apr 16 19:31:09.460918 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:09.460897 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-868c997c74-gjwrm"] Apr 16 19:31:11.086374 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:11.086340 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a73ff427-3c2f-49ce-8ff9-259f21021b2a" path="/var/lib/kubelet/pods/a73ff427-3c2f-49ce-8ff9-259f21021b2a/volumes" Apr 16 
19:31:12.841350 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.841319 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-66c6fd6db6-r696d"] Apr 16 19:31:12.841826 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.841739 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0425eb24-ec48-4ffd-863d-51135aa5ce39" containerName="authorino" Apr 16 19:31:12.841826 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.841757 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0425eb24-ec48-4ffd-863d-51135aa5ce39" containerName="authorino" Apr 16 19:31:12.841826 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.841791 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a73ff427-3c2f-49ce-8ff9-259f21021b2a" containerName="maas-api" Apr 16 19:31:12.841826 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.841798 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73ff427-3c2f-49ce-8ff9-259f21021b2a" containerName="maas-api" Apr 16 19:31:12.841826 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.841814 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="082eb313-d0d8-45e0-81df-67ca16b31c25" containerName="authorino" Apr 16 19:31:12.841826 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.841821 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="082eb313-d0d8-45e0-81df-67ca16b31c25" containerName="authorino" Apr 16 19:31:12.842128 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.841830 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c90c1148-408b-417a-a2ef-abd66d936ee6" containerName="authorino" Apr 16 19:31:12.842128 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.841840 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90c1148-408b-417a-a2ef-abd66d936ee6" containerName="authorino" Apr 16 19:31:12.842128 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.841912 
2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c90c1148-408b-417a-a2ef-abd66d936ee6" containerName="authorino" Apr 16 19:31:12.842128 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.841923 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0425eb24-ec48-4ffd-863d-51135aa5ce39" containerName="authorino" Apr 16 19:31:12.842128 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.841934 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="a73ff427-3c2f-49ce-8ff9-259f21021b2a" containerName="maas-api" Apr 16 19:31:12.842128 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.841945 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="082eb313-d0d8-45e0-81df-67ca16b31c25" containerName="authorino" Apr 16 19:31:12.846160 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.846137 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66c6fd6db6-r696d" Apr 16 19:31:12.848618 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.848594 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-kq8gt\"" Apr 16 19:31:12.853103 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.853075 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-66c6fd6db6-r696d"] Apr 16 19:31:12.990665 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:12.990638 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsqvx\" (UniqueName: \"kubernetes.io/projected/119c9fd9-c6e0-496b-957a-0a9d1cca824b-kube-api-access-xsqvx\") pod \"maas-controller-66c6fd6db6-r696d\" (UID: \"119c9fd9-c6e0-496b-957a-0a9d1cca824b\") " pod="opendatahub/maas-controller-66c6fd6db6-r696d" Apr 16 19:31:13.091209 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:13.091162 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xsqvx\" (UniqueName: \"kubernetes.io/projected/119c9fd9-c6e0-496b-957a-0a9d1cca824b-kube-api-access-xsqvx\") pod \"maas-controller-66c6fd6db6-r696d\" (UID: \"119c9fd9-c6e0-496b-957a-0a9d1cca824b\") " pod="opendatahub/maas-controller-66c6fd6db6-r696d" Apr 16 19:31:13.100330 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:13.100265 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsqvx\" (UniqueName: \"kubernetes.io/projected/119c9fd9-c6e0-496b-957a-0a9d1cca824b-kube-api-access-xsqvx\") pod \"maas-controller-66c6fd6db6-r696d\" (UID: \"119c9fd9-c6e0-496b-957a-0a9d1cca824b\") " pod="opendatahub/maas-controller-66c6fd6db6-r696d" Apr 16 19:31:13.159946 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:13.159920 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-kq8gt\"" Apr 16 19:31:13.167258 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:13.167226 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-66c6fd6db6-r696d" Apr 16 19:31:13.285616 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:13.285584 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-66c6fd6db6-r696d"] Apr 16 19:31:13.289141 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:31:13.289113 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod119c9fd9_c6e0_496b_957a_0a9d1cca824b.slice/crio-d70a471257d29e0c5405821bdecbde8becccc3c8ae2a5f29dee2ffffb25a8a98 WatchSource:0}: Error finding container d70a471257d29e0c5405821bdecbde8becccc3c8ae2a5f29dee2ffffb25a8a98: Status 404 returned error can't find the container with id d70a471257d29e0c5405821bdecbde8becccc3c8ae2a5f29dee2ffffb25a8a98 Apr 16 19:31:13.290491 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:13.290471 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:31:13.461649 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:13.461614 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66c6fd6db6-r696d" event={"ID":"119c9fd9-c6e0-496b-957a-0a9d1cca824b","Type":"ContainerStarted","Data":"d70a471257d29e0c5405821bdecbde8becccc3c8ae2a5f29dee2ffffb25a8a98"} Apr 16 19:31:15.469268 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:15.469237 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66c6fd6db6-r696d" event={"ID":"119c9fd9-c6e0-496b-957a-0a9d1cca824b","Type":"ContainerStarted","Data":"f4de0656d9a208e03c544c439437d3551fa63cc3a3b0a56831c47e79a41688cc"} Apr 16 19:31:15.469641 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:15.469354 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-66c6fd6db6-r696d" Apr 16 19:31:15.489441 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:15.489390 2580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-66c6fd6db6-r696d" podStartSLOduration=2.064091982 podStartE2EDuration="3.489375234s" podCreationTimestamp="2026-04-16 19:31:12 +0000 UTC" firstStartedPulling="2026-04-16 19:31:13.290626673 +0000 UTC m=+780.777824483" lastFinishedPulling="2026-04-16 19:31:14.715909921 +0000 UTC m=+782.203107735" observedRunningTime="2026-04-16 19:31:15.487427183 +0000 UTC m=+782.974625014" watchObservedRunningTime="2026-04-16 19:31:15.489375234 +0000 UTC m=+782.976573066" Apr 16 19:31:26.479672 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:26.479576 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-66c6fd6db6-r696d" Apr 16 19:31:52.452771 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.452734 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s"] Apr 16 19:31:52.460768 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.460744 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.464137 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.464114 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 16 19:31:52.465065 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.465044 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-94ng7\"" Apr 16 19:31:52.465264 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.465241 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 16 19:31:52.465388 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.465368 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 16 19:31:52.466702 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.466681 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s"] Apr 16 19:31:52.523444 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.523405 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/be291068-3189-4f00-8940-1b5f358598da-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.523444 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.523451 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/be291068-3189-4f00-8940-1b5f358598da-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.523652 
ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.523469 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be291068-3189-4f00-8940-1b5f358598da-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.523652 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.523565 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/be291068-3189-4f00-8940-1b5f358598da-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.523652 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.523630 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/be291068-3189-4f00-8940-1b5f358598da-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.523749 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.523659 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkrt9\" (UniqueName: \"kubernetes.io/projected/be291068-3189-4f00-8940-1b5f358598da-kube-api-access-kkrt9\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.624875 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.624835 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/be291068-3189-4f00-8940-1b5f358598da-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.625051 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.624895 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/be291068-3189-4f00-8940-1b5f358598da-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.625051 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.624922 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be291068-3189-4f00-8940-1b5f358598da-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.625051 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.624972 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/be291068-3189-4f00-8940-1b5f358598da-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.625259 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.625063 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/be291068-3189-4f00-8940-1b5f358598da-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.625372 ip-10-0-130-83 
kubenswrapper[2580]: I0416 19:31:52.625352 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/be291068-3189-4f00-8940-1b5f358598da-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.625456 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.625432 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be291068-3189-4f00-8940-1b5f358598da-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.625622 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.625597 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkrt9\" (UniqueName: \"kubernetes.io/projected/be291068-3189-4f00-8940-1b5f358598da-kube-api-access-kkrt9\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.625917 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.625896 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/be291068-3189-4f00-8940-1b5f358598da-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.627997 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.627966 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/be291068-3189-4f00-8940-1b5f358598da-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" 
(UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.628282 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.628261 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/be291068-3189-4f00-8940-1b5f358598da-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.639277 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.639250 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkrt9\" (UniqueName: \"kubernetes.io/projected/be291068-3189-4f00-8940-1b5f358598da-kube-api-access-kkrt9\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-h965s\" (UID: \"be291068-3189-4f00-8940-1b5f358598da\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.774617 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.772206 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:31:52.917637 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:52.917600 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s"] Apr 16 19:31:52.922550 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:31:52.922516 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe291068_3189_4f00_8940_1b5f358598da.slice/crio-baae46b61c3a41c396aade7f4fa829b8dcb6988a8c43ede0b9fca05cd2096dea WatchSource:0}: Error finding container baae46b61c3a41c396aade7f4fa829b8dcb6988a8c43ede0b9fca05cd2096dea: Status 404 returned error can't find the container with id baae46b61c3a41c396aade7f4fa829b8dcb6988a8c43ede0b9fca05cd2096dea Apr 16 19:31:53.591168 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:53.591134 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" event={"ID":"be291068-3189-4f00-8940-1b5f358598da","Type":"ContainerStarted","Data":"baae46b61c3a41c396aade7f4fa829b8dcb6988a8c43ede0b9fca05cd2096dea"} Apr 16 19:31:59.614236 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:31:59.614175 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" event={"ID":"be291068-3189-4f00-8940-1b5f358598da","Type":"ContainerStarted","Data":"fa6b8b30f7ef1e8cc6d47c340ad45fc20499ad3707f9b4010f2126ae5f28468f"} Apr 16 19:32:04.631104 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:04.631070 2580 generic.go:358] "Generic (PLEG): container finished" podID="be291068-3189-4f00-8940-1b5f358598da" containerID="fa6b8b30f7ef1e8cc6d47c340ad45fc20499ad3707f9b4010f2126ae5f28468f" exitCode=0 Apr 16 19:32:04.631495 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:04.631143 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" 
event={"ID":"be291068-3189-4f00-8940-1b5f358598da","Type":"ContainerDied","Data":"fa6b8b30f7ef1e8cc6d47c340ad45fc20499ad3707f9b4010f2126ae5f28468f"} Apr 16 19:32:08.654327 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:08.654236 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" event={"ID":"be291068-3189-4f00-8940-1b5f358598da","Type":"ContainerStarted","Data":"1a35814e74f9db6a6424971c794737342113b97587c6d080bd521a1f47b588d4"} Apr 16 19:32:08.654695 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:08.654503 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:32:08.679470 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:08.679417 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" podStartSLOduration=1.206342572 podStartE2EDuration="16.679402893s" podCreationTimestamp="2026-04-16 19:31:52 +0000 UTC" firstStartedPulling="2026-04-16 19:31:52.924630704 +0000 UTC m=+820.411828514" lastFinishedPulling="2026-04-16 19:32:08.39769102 +0000 UTC m=+835.884888835" observedRunningTime="2026-04-16 19:32:08.679217282 +0000 UTC m=+836.166415115" watchObservedRunningTime="2026-04-16 19:32:08.679402893 +0000 UTC m=+836.166600725" Apr 16 19:32:19.670306 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:19.670277 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-h965s" Apr 16 19:32:34.581204 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.581159 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw"] Apr 16 19:32:34.584203 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.584170 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.587454 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.587432 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 16 19:32:34.605285 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.605257 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw"] Apr 16 19:32:34.697315 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.697281 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z46k6\" (UniqueName: \"kubernetes.io/projected/d55381cc-e604-4e60-a349-4aff29642410-kube-api-access-z46k6\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.697543 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.697345 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d55381cc-e604-4e60-a349-4aff29642410-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.697543 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.697423 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d55381cc-e604-4e60-a349-4aff29642410-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.697543 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.697466 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d55381cc-e604-4e60-a349-4aff29642410-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.697543 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.697523 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d55381cc-e604-4e60-a349-4aff29642410-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.697716 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.697571 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d55381cc-e604-4e60-a349-4aff29642410-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.798811 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.798774 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d55381cc-e604-4e60-a349-4aff29642410-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 
19:32:34.798811 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.798816 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d55381cc-e604-4e60-a349-4aff29642410-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.799051 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.798835 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d55381cc-e604-4e60-a349-4aff29642410-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.799051 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.798858 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d55381cc-e604-4e60-a349-4aff29642410-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.799051 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.798899 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z46k6\" (UniqueName: \"kubernetes.io/projected/d55381cc-e604-4e60-a349-4aff29642410-kube-api-access-z46k6\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.799051 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.798964 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d55381cc-e604-4e60-a349-4aff29642410-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.799280 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.799247 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d55381cc-e604-4e60-a349-4aff29642410-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.799280 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.799270 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d55381cc-e604-4e60-a349-4aff29642410-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.799401 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.799384 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d55381cc-e604-4e60-a349-4aff29642410-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.801117 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.801092 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d55381cc-e604-4e60-a349-4aff29642410-dshm\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.801434 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.801416 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d55381cc-e604-4e60-a349-4aff29642410-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.807721 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.807697 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z46k6\" (UniqueName: \"kubernetes.io/projected/d55381cc-e604-4e60-a349-4aff29642410-kube-api-access-z46k6\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw\" (UID: \"d55381cc-e604-4e60-a349-4aff29642410\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:34.893461 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:34.893377 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:35.022988 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:35.022959 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw"] Apr 16 19:32:35.025536 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:32:35.025509 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd55381cc_e604_4e60_a349_4aff29642410.slice/crio-46c9d7c558f1c979402ae4b71d50299dc250bd335da00fdbe214cfa9c8c85f98 WatchSource:0}: Error finding container 46c9d7c558f1c979402ae4b71d50299dc250bd335da00fdbe214cfa9c8c85f98: Status 404 returned error can't find the container with id 46c9d7c558f1c979402ae4b71d50299dc250bd335da00fdbe214cfa9c8c85f98 Apr 16 19:32:35.752259 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:35.752216 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" event={"ID":"d55381cc-e604-4e60-a349-4aff29642410","Type":"ContainerStarted","Data":"f479e856e58435faadc32f01e648ba8f78164e40fa5ff63c226b52ef40552de4"} Apr 16 19:32:35.752659 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:35.752268 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" event={"ID":"d55381cc-e604-4e60-a349-4aff29642410","Type":"ContainerStarted","Data":"46c9d7c558f1c979402ae4b71d50299dc250bd335da00fdbe214cfa9c8c85f98"} Apr 16 19:32:40.774132 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:40.774099 2580 generic.go:358] "Generic (PLEG): container finished" podID="d55381cc-e604-4e60-a349-4aff29642410" containerID="f479e856e58435faadc32f01e648ba8f78164e40fa5ff63c226b52ef40552de4" exitCode=0 Apr 16 19:32:40.774531 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:40.774179 2580 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" event={"ID":"d55381cc-e604-4e60-a349-4aff29642410","Type":"ContainerDied","Data":"f479e856e58435faadc32f01e648ba8f78164e40fa5ff63c226b52ef40552de4"} Apr 16 19:32:41.779248 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:41.779210 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" event={"ID":"d55381cc-e604-4e60-a349-4aff29642410","Type":"ContainerStarted","Data":"ef74af4b4911aca8acab0b7e252505b7316a0a5e2ae4aead3a3432fa9ce61d4e"} Apr 16 19:32:41.779642 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:41.779425 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:41.800702 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:41.800649 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" podStartSLOduration=7.597306268 podStartE2EDuration="7.800630618s" podCreationTimestamp="2026-04-16 19:32:34 +0000 UTC" firstStartedPulling="2026-04-16 19:32:40.774795413 +0000 UTC m=+868.261993223" lastFinishedPulling="2026-04-16 19:32:40.978119751 +0000 UTC m=+868.465317573" observedRunningTime="2026-04-16 19:32:41.799623892 +0000 UTC m=+869.286821726" watchObservedRunningTime="2026-04-16 19:32:41.800630618 +0000 UTC m=+869.287828451" Apr 16 19:32:44.647334 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.647297 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4"] Apr 16 19:32:44.650815 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.650796 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.653508 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.653488 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 16 19:32:44.663242 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.663218 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4"] Apr 16 19:32:44.787239 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.787180 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/94269a68-4661-4d96-a9aa-00bff162ad5a-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.787411 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.787244 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/94269a68-4661-4d96-a9aa-00bff162ad5a-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.787411 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.787277 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzdnp\" (UniqueName: \"kubernetes.io/projected/94269a68-4661-4d96-a9aa-00bff162ad5a-kube-api-access-dzdnp\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.787411 ip-10-0-130-83 
kubenswrapper[2580]: I0416 19:32:44.787328 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/94269a68-4661-4d96-a9aa-00bff162ad5a-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.787411 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.787356 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94269a68-4661-4d96-a9aa-00bff162ad5a-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.787411 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.787402 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/94269a68-4661-4d96-a9aa-00bff162ad5a-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.888496 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.888459 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/94269a68-4661-4d96-a9aa-00bff162ad5a-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.888680 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.888531 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/94269a68-4661-4d96-a9aa-00bff162ad5a-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.888680 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.888570 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/94269a68-4661-4d96-a9aa-00bff162ad5a-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.888680 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.888608 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzdnp\" (UniqueName: \"kubernetes.io/projected/94269a68-4661-4d96-a9aa-00bff162ad5a-kube-api-access-dzdnp\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.888680 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.888648 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/94269a68-4661-4d96-a9aa-00bff162ad5a-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.888680 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.888674 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94269a68-4661-4d96-a9aa-00bff162ad5a-kserve-provision-location\") pod 
\"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.888996 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.888965 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/94269a68-4661-4d96-a9aa-00bff162ad5a-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.889104 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.889084 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/94269a68-4661-4d96-a9aa-00bff162ad5a-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.889150 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.889137 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94269a68-4661-4d96-a9aa-00bff162ad5a-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.890858 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.890837 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/94269a68-4661-4d96-a9aa-00bff162ad5a-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.891155 
ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.891138 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/94269a68-4661-4d96-a9aa-00bff162ad5a-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.908993 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.908936 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzdnp\" (UniqueName: \"kubernetes.io/projected/94269a68-4661-4d96-a9aa-00bff162ad5a-kube-api-access-dzdnp\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4\" (UID: \"94269a68-4661-4d96-a9aa-00bff162ad5a\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:44.960199 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:44.960165 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:32:45.086895 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:32:45.086428 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94269a68_4661_4d96_a9aa_00bff162ad5a.slice/crio-d2d22280b71567ba6532d191b3d99f4260e521bb06f635102617fab802da834a WatchSource:0}: Error finding container d2d22280b71567ba6532d191b3d99f4260e521bb06f635102617fab802da834a: Status 404 returned error can't find the container with id d2d22280b71567ba6532d191b3d99f4260e521bb06f635102617fab802da834a Apr 16 19:32:45.091044 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:45.091025 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4"] Apr 16 19:32:45.793087 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:45.793046 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" event={"ID":"94269a68-4661-4d96-a9aa-00bff162ad5a","Type":"ContainerStarted","Data":"83feb4d188144b33ea54c14e4fdecef0290c3c7fd9dbc76c784bff91cd8e15e4"} Apr 16 19:32:45.793087 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:45.793080 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" event={"ID":"94269a68-4661-4d96-a9aa-00bff162ad5a","Type":"ContainerStarted","Data":"d2d22280b71567ba6532d191b3d99f4260e521bb06f635102617fab802da834a"} Apr 16 19:32:52.795761 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:52.795683 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw" Apr 16 19:32:53.822544 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:53.822514 2580 generic.go:358] "Generic (PLEG): container finished" podID="94269a68-4661-4d96-a9aa-00bff162ad5a" 
containerID="83feb4d188144b33ea54c14e4fdecef0290c3c7fd9dbc76c784bff91cd8e15e4" exitCode=0 Apr 16 19:32:53.822921 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:53.822553 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" event={"ID":"94269a68-4661-4d96-a9aa-00bff162ad5a","Type":"ContainerDied","Data":"83feb4d188144b33ea54c14e4fdecef0290c3c7fd9dbc76c784bff91cd8e15e4"} Apr 16 19:32:54.827520 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:54.827487 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" event={"ID":"94269a68-4661-4d96-a9aa-00bff162ad5a","Type":"ContainerStarted","Data":"c8f526eaee88503f63c03698b16372b4c197b8a59feefaf1f4de6a4007a0dec7"} Apr 16 19:32:54.827899 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:32:54.827685 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:33:05.844233 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:05.844177 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" Apr 16 19:33:05.890471 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:05.890418 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4" podStartSLOduration=21.659240269 podStartE2EDuration="21.890399663s" podCreationTimestamp="2026-04-16 19:32:44 +0000 UTC" firstStartedPulling="2026-04-16 19:32:53.823167204 +0000 UTC m=+881.310365017" lastFinishedPulling="2026-04-16 19:32:54.0543266 +0000 UTC m=+881.541524411" observedRunningTime="2026-04-16 19:32:54.870402895 +0000 UTC m=+882.357600728" watchObservedRunningTime="2026-04-16 19:33:05.890399663 +0000 UTC m=+893.377597495" Apr 16 19:33:13.022142 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:13.022111 
2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:33:13.024123 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:13.024102 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:33:23.101066 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:23.101037 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-775bb8958f-7f65r"] Apr 16 19:33:23.106095 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:23.106076 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-775bb8958f-7f65r" Apr 16 19:33:23.111521 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:23.111491 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-775bb8958f-7f65r"] Apr 16 19:33:23.123878 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:23.123844 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tffb2\" (UniqueName: \"kubernetes.io/projected/92039fd9-eda7-455f-940e-57a5ffe3cba0-kube-api-access-tffb2\") pod \"authorino-775bb8958f-7f65r\" (UID: \"92039fd9-eda7-455f-940e-57a5ffe3cba0\") " pod="kuadrant-system/authorino-775bb8958f-7f65r" Apr 16 19:33:23.124055 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:23.123976 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/92039fd9-eda7-455f-940e-57a5ffe3cba0-tls-cert\") pod \"authorino-775bb8958f-7f65r\" (UID: \"92039fd9-eda7-455f-940e-57a5ffe3cba0\") " pod="kuadrant-system/authorino-775bb8958f-7f65r" Apr 16 19:33:23.224514 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:23.224478 2580 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-tffb2\" (UniqueName: \"kubernetes.io/projected/92039fd9-eda7-455f-940e-57a5ffe3cba0-kube-api-access-tffb2\") pod \"authorino-775bb8958f-7f65r\" (UID: \"92039fd9-eda7-455f-940e-57a5ffe3cba0\") " pod="kuadrant-system/authorino-775bb8958f-7f65r" Apr 16 19:33:23.224709 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:23.224535 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/92039fd9-eda7-455f-940e-57a5ffe3cba0-tls-cert\") pod \"authorino-775bb8958f-7f65r\" (UID: \"92039fd9-eda7-455f-940e-57a5ffe3cba0\") " pod="kuadrant-system/authorino-775bb8958f-7f65r" Apr 16 19:33:23.227021 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:23.226998 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/92039fd9-eda7-455f-940e-57a5ffe3cba0-tls-cert\") pod \"authorino-775bb8958f-7f65r\" (UID: \"92039fd9-eda7-455f-940e-57a5ffe3cba0\") " pod="kuadrant-system/authorino-775bb8958f-7f65r" Apr 16 19:33:23.236783 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:23.236753 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tffb2\" (UniqueName: \"kubernetes.io/projected/92039fd9-eda7-455f-940e-57a5ffe3cba0-kube-api-access-tffb2\") pod \"authorino-775bb8958f-7f65r\" (UID: \"92039fd9-eda7-455f-940e-57a5ffe3cba0\") " pod="kuadrant-system/authorino-775bb8958f-7f65r" Apr 16 19:33:23.416596 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:23.416502 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-775bb8958f-7f65r" Apr 16 19:33:23.555471 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:23.555432 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-775bb8958f-7f65r"] Apr 16 19:33:23.558781 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:33:23.558752 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92039fd9_eda7_455f_940e_57a5ffe3cba0.slice/crio-6df7e6600a3cb45a7965b923d60c299697d10b26adf8625b657d6d85cbaa8bc9 WatchSource:0}: Error finding container 6df7e6600a3cb45a7965b923d60c299697d10b26adf8625b657d6d85cbaa8bc9: Status 404 returned error can't find the container with id 6df7e6600a3cb45a7965b923d60c299697d10b26adf8625b657d6d85cbaa8bc9 Apr 16 19:33:23.935596 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:23.935534 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-775bb8958f-7f65r" event={"ID":"92039fd9-eda7-455f-940e-57a5ffe3cba0","Type":"ContainerStarted","Data":"6df7e6600a3cb45a7965b923d60c299697d10b26adf8625b657d6d85cbaa8bc9"} Apr 16 19:33:24.940459 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:24.940425 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-775bb8958f-7f65r" event={"ID":"92039fd9-eda7-455f-940e-57a5ffe3cba0","Type":"ContainerStarted","Data":"cfd910345ac556b2cbf8c6dce57034cd9a739a6de801cd8fba60b17a4948613b"} Apr 16 19:33:24.960606 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:24.960546 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-775bb8958f-7f65r" podStartSLOduration=1.47288292 podStartE2EDuration="1.960529671s" podCreationTimestamp="2026-04-16 19:33:23 +0000 UTC" firstStartedPulling="2026-04-16 19:33:23.560139715 +0000 UTC m=+911.047337538" lastFinishedPulling="2026-04-16 19:33:24.047786479 +0000 UTC m=+911.534984289" 
observedRunningTime="2026-04-16 19:33:24.958048479 +0000 UTC m=+912.445246312" watchObservedRunningTime="2026-04-16 19:33:24.960529671 +0000 UTC m=+912.447727502" Apr 16 19:33:24.989209 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:24.989164 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5dd77fbc76-v9bt2"] Apr 16 19:33:24.989441 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:24.989389 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-5dd77fbc76-v9bt2" podUID="b82bf6e4-b659-444d-815d-ec2632766fc9" containerName="authorino" containerID="cri-o://9a8d8488785b89fbc7f2f33cd496bffab715c517e5d36bd9ee39771804a43ab1" gracePeriod=30 Apr 16 19:33:25.229715 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.229688 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5dd77fbc76-v9bt2" Apr 16 19:33:25.343026 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.342988 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6q5z\" (UniqueName: \"kubernetes.io/projected/b82bf6e4-b659-444d-815d-ec2632766fc9-kube-api-access-s6q5z\") pod \"b82bf6e4-b659-444d-815d-ec2632766fc9\" (UID: \"b82bf6e4-b659-444d-815d-ec2632766fc9\") " Apr 16 19:33:25.343185 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.343053 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b82bf6e4-b659-444d-815d-ec2632766fc9-tls-cert\") pod \"b82bf6e4-b659-444d-815d-ec2632766fc9\" (UID: \"b82bf6e4-b659-444d-815d-ec2632766fc9\") " Apr 16 19:33:25.345295 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.345255 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b82bf6e4-b659-444d-815d-ec2632766fc9-kube-api-access-s6q5z" (OuterVolumeSpecName: "kube-api-access-s6q5z") pod 
"b82bf6e4-b659-444d-815d-ec2632766fc9" (UID: "b82bf6e4-b659-444d-815d-ec2632766fc9"). InnerVolumeSpecName "kube-api-access-s6q5z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:33:25.354454 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.354410 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b82bf6e4-b659-444d-815d-ec2632766fc9-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "b82bf6e4-b659-444d-815d-ec2632766fc9" (UID: "b82bf6e4-b659-444d-815d-ec2632766fc9"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:33:25.444258 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.444221 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s6q5z\" (UniqueName: \"kubernetes.io/projected/b82bf6e4-b659-444d-815d-ec2632766fc9-kube-api-access-s6q5z\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:33:25.444258 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.444250 2580 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b82bf6e4-b659-444d-815d-ec2632766fc9-tls-cert\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:33:25.944357 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.944320 2580 generic.go:358] "Generic (PLEG): container finished" podID="b82bf6e4-b659-444d-815d-ec2632766fc9" containerID="9a8d8488785b89fbc7f2f33cd496bffab715c517e5d36bd9ee39771804a43ab1" exitCode=0 Apr 16 19:33:25.944801 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.944371 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5dd77fbc76-v9bt2" Apr 16 19:33:25.944801 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.944403 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5dd77fbc76-v9bt2" event={"ID":"b82bf6e4-b659-444d-815d-ec2632766fc9","Type":"ContainerDied","Data":"9a8d8488785b89fbc7f2f33cd496bffab715c517e5d36bd9ee39771804a43ab1"} Apr 16 19:33:25.944801 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.944440 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5dd77fbc76-v9bt2" event={"ID":"b82bf6e4-b659-444d-815d-ec2632766fc9","Type":"ContainerDied","Data":"ffdd11ab21ed134ceb2ec4aecd3474bd0aa0b71fab42a7f9ad31ba64dffd96cb"} Apr 16 19:33:25.944801 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.944455 2580 scope.go:117] "RemoveContainer" containerID="9a8d8488785b89fbc7f2f33cd496bffab715c517e5d36bd9ee39771804a43ab1" Apr 16 19:33:25.952966 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.952949 2580 scope.go:117] "RemoveContainer" containerID="9a8d8488785b89fbc7f2f33cd496bffab715c517e5d36bd9ee39771804a43ab1" Apr 16 19:33:25.953246 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:33:25.953227 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a8d8488785b89fbc7f2f33cd496bffab715c517e5d36bd9ee39771804a43ab1\": container with ID starting with 9a8d8488785b89fbc7f2f33cd496bffab715c517e5d36bd9ee39771804a43ab1 not found: ID does not exist" containerID="9a8d8488785b89fbc7f2f33cd496bffab715c517e5d36bd9ee39771804a43ab1" Apr 16 19:33:25.953330 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.953256 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a8d8488785b89fbc7f2f33cd496bffab715c517e5d36bd9ee39771804a43ab1"} err="failed to get container status \"9a8d8488785b89fbc7f2f33cd496bffab715c517e5d36bd9ee39771804a43ab1\": rpc error: code = 
NotFound desc = could not find container \"9a8d8488785b89fbc7f2f33cd496bffab715c517e5d36bd9ee39771804a43ab1\": container with ID starting with 9a8d8488785b89fbc7f2f33cd496bffab715c517e5d36bd9ee39771804a43ab1 not found: ID does not exist" Apr 16 19:33:25.968799 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.968766 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5dd77fbc76-v9bt2"] Apr 16 19:33:25.972114 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:25.972094 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5dd77fbc76-v9bt2"] Apr 16 19:33:27.087520 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:33:27.087482 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b82bf6e4-b659-444d-815d-ec2632766fc9" path="/var/lib/kubelet/pods/b82bf6e4-b659-444d-815d-ec2632766fc9/volumes" Apr 16 19:34:46.111349 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:46.111268 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-66c6fd6db6-r696d"] Apr 16 19:34:46.111856 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:46.111527 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-66c6fd6db6-r696d" podUID="119c9fd9-c6e0-496b-957a-0a9d1cca824b" containerName="manager" containerID="cri-o://f4de0656d9a208e03c544c439437d3551fa63cc3a3b0a56831c47e79a41688cc" gracePeriod=10 Apr 16 19:34:46.350085 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:46.350062 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-66c6fd6db6-r696d" Apr 16 19:34:46.441984 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:46.441949 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsqvx\" (UniqueName: \"kubernetes.io/projected/119c9fd9-c6e0-496b-957a-0a9d1cca824b-kube-api-access-xsqvx\") pod \"119c9fd9-c6e0-496b-957a-0a9d1cca824b\" (UID: \"119c9fd9-c6e0-496b-957a-0a9d1cca824b\") " Apr 16 19:34:46.444028 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:46.443991 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/119c9fd9-c6e0-496b-957a-0a9d1cca824b-kube-api-access-xsqvx" (OuterVolumeSpecName: "kube-api-access-xsqvx") pod "119c9fd9-c6e0-496b-957a-0a9d1cca824b" (UID: "119c9fd9-c6e0-496b-957a-0a9d1cca824b"). InnerVolumeSpecName "kube-api-access-xsqvx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:34:46.543202 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:46.543164 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xsqvx\" (UniqueName: \"kubernetes.io/projected/119c9fd9-c6e0-496b-957a-0a9d1cca824b-kube-api-access-xsqvx\") on node \"ip-10-0-130-83.ec2.internal\" DevicePath \"\"" Apr 16 19:34:47.215972 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:47.215938 2580 generic.go:358] "Generic (PLEG): container finished" podID="119c9fd9-c6e0-496b-957a-0a9d1cca824b" containerID="f4de0656d9a208e03c544c439437d3551fa63cc3a3b0a56831c47e79a41688cc" exitCode=0 Apr 16 19:34:47.216414 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:47.215994 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66c6fd6db6-r696d" event={"ID":"119c9fd9-c6e0-496b-957a-0a9d1cca824b","Type":"ContainerDied","Data":"f4de0656d9a208e03c544c439437d3551fa63cc3a3b0a56831c47e79a41688cc"} Apr 16 19:34:47.216414 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:47.216023 2580 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66c6fd6db6-r696d" Apr 16 19:34:47.216414 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:47.216031 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66c6fd6db6-r696d" event={"ID":"119c9fd9-c6e0-496b-957a-0a9d1cca824b","Type":"ContainerDied","Data":"d70a471257d29e0c5405821bdecbde8becccc3c8ae2a5f29dee2ffffb25a8a98"} Apr 16 19:34:47.216414 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:47.216047 2580 scope.go:117] "RemoveContainer" containerID="f4de0656d9a208e03c544c439437d3551fa63cc3a3b0a56831c47e79a41688cc" Apr 16 19:34:47.223761 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:47.223747 2580 scope.go:117] "RemoveContainer" containerID="f4de0656d9a208e03c544c439437d3551fa63cc3a3b0a56831c47e79a41688cc" Apr 16 19:34:47.223996 ip-10-0-130-83 kubenswrapper[2580]: E0416 19:34:47.223974 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4de0656d9a208e03c544c439437d3551fa63cc3a3b0a56831c47e79a41688cc\": container with ID starting with f4de0656d9a208e03c544c439437d3551fa63cc3a3b0a56831c47e79a41688cc not found: ID does not exist" containerID="f4de0656d9a208e03c544c439437d3551fa63cc3a3b0a56831c47e79a41688cc" Apr 16 19:34:47.224100 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:47.224000 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4de0656d9a208e03c544c439437d3551fa63cc3a3b0a56831c47e79a41688cc"} err="failed to get container status \"f4de0656d9a208e03c544c439437d3551fa63cc3a3b0a56831c47e79a41688cc\": rpc error: code = NotFound desc = could not find container \"f4de0656d9a208e03c544c439437d3551fa63cc3a3b0a56831c47e79a41688cc\": container with ID starting with f4de0656d9a208e03c544c439437d3551fa63cc3a3b0a56831c47e79a41688cc not found: ID does not exist" Apr 16 19:34:47.235141 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:47.235117 
2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-66c6fd6db6-r696d"] Apr 16 19:34:47.238018 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:47.237999 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-66c6fd6db6-r696d"] Apr 16 19:34:49.086114 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:34:49.086075 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="119c9fd9-c6e0-496b-957a-0a9d1cca824b" path="/var/lib/kubelet/pods/119c9fd9-c6e0-496b-957a-0a9d1cca824b/volumes" Apr 16 19:38:13.044279 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:38:13.044248 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:38:13.047077 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:38:13.047059 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:43:13.066114 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:43:13.066083 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:43:13.069877 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:43:13.069853 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:48:13.088490 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:48:13.088455 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:48:13.092901 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:48:13.092882 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:53:13.108995 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:53:13.108967 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:53:13.113767 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:53:13.113742 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:55:47.549817 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:55:47.549742 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-775bb8958f-7f65r_92039fd9-eda7-455f-940e-57a5ffe3cba0/authorino/0.log" Apr 16 19:55:51.812835 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:55:51.812809 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-66b64c949f-kkqkp_45f8bb8d-c41a-49de-8234-c33c2db30686/manager/0.log" Apr 16 19:55:52.050167 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:55:52.050124 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-hlpc9_63e1aaf6-0990-425e-87c6-bacb2d2b4237/postgres/0.log" Apr 16 19:55:53.282675 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:55:53.282646 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-775bb8958f-7f65r_92039fd9-eda7-455f-940e-57a5ffe3cba0/authorino/0.log" Apr 16 19:55:53.504732 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:55:53.504678 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-mqxks_ed7f38da-c8d3-41f3-8ddf-9aa5d20126d5/manager/0.log" Apr 16 19:55:54.049262 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:55:54.049230 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-p4hdf_55d9ad56-b9ee-4b5b-b3b7-b283c1345cb4/manager/0.log" Apr 16 19:55:54.378340 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:55:54.378256 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq_daf7c7ae-16fc-40bf-a381-013352b8d1b0/istio-proxy/0.log" Apr 16 19:55:54.814926 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:55:54.814883 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-k5nss_c5891986-b5a8-4513-9eb2-21ab3a24ee40/istio-proxy/0.log" Apr 16 19:55:55.488966 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:55:55.488929 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-h965s_be291068-3189-4f00-8940-1b5f358598da/storage-initializer/0.log" Apr 16 19:55:55.495512 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:55:55.495487 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-h965s_be291068-3189-4f00-8940-1b5f358598da/main/0.log" Apr 16 19:55:55.600988 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:55:55.600961 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw_d55381cc-e604-4e60-a349-4aff29642410/storage-initializer/0.log" Apr 16 19:55:55.607983 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:55:55.607962 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccwbgpw_d55381cc-e604-4e60-a349-4aff29642410/main/0.log" Apr 16 19:55:55.824438 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:55:55.824350 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4_94269a68-4661-4d96-a9aa-00bff162ad5a/storage-initializer/0.log" Apr 16 19:55:55.830357 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:55:55.830335 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-9vpq4_94269a68-4661-4d96-a9aa-00bff162ad5a/main/0.log" Apr 16 19:56:02.934377 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:02.934347 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qwd72_a0ecccc6-8176-439c-ac99-888ce68e7bf3/global-pull-secret-syncer/0.log" Apr 16 19:56:02.992169 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:02.992134 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-5zbnv_f26e651f-cca2-4b49-b9c1-af63f23ad901/konnectivity-agent/0.log" Apr 16 19:56:03.122943 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:03.122900 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-83.ec2.internal_058aac4aaec6cd6ac195597fdcd3d1b7/haproxy/0.log" Apr 16 19:56:07.488649 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:07.488609 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-775bb8958f-7f65r_92039fd9-eda7-455f-940e-57a5ffe3cba0/authorino/0.log" Apr 16 19:56:07.543808 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:07.543754 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-mqxks_ed7f38da-c8d3-41f3-8ddf-9aa5d20126d5/manager/0.log" Apr 16 19:56:07.758079 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:07.757959 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-p4hdf_55d9ad56-b9ee-4b5b-b3b7-b283c1345cb4/manager/0.log" Apr 16 19:56:09.361516 ip-10-0-130-83 kubenswrapper[2580]: I0416 
19:56:09.361471 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mql76_f792fa48-e0c2-4512-890b-752dbfc3ceaf/kube-state-metrics/0.log" Apr 16 19:56:09.381913 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:09.381887 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mql76_f792fa48-e0c2-4512-890b-752dbfc3ceaf/kube-rbac-proxy-main/0.log" Apr 16 19:56:09.415329 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:09.415301 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mql76_f792fa48-e0c2-4512-890b-752dbfc3ceaf/kube-rbac-proxy-self/0.log" Apr 16 19:56:09.475341 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:09.475317 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-bv259_5cd597be-9435-451a-9f7c-11d341f570a5/monitoring-plugin/0.log" Apr 16 19:56:09.506616 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:09.506584 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mhtzt_e39342d6-d92b-4524-a119-2c56664bbc27/node-exporter/0.log" Apr 16 19:56:09.525506 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:09.525481 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mhtzt_e39342d6-d92b-4524-a119-2c56664bbc27/kube-rbac-proxy/0.log" Apr 16 19:56:09.544830 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:09.544804 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mhtzt_e39342d6-d92b-4524-a119-2c56664bbc27/init-textfile/0.log" Apr 16 19:56:10.019443 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:10.019396 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-f54jm_c9e5c2e8-1a16-46a7-8cdf-f98fc27ce9a6/prometheus-operator-admission-webhook/0.log" Apr 16 19:56:11.454502 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.454467 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5"] Apr 16 19:56:11.454976 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.454957 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="119c9fd9-c6e0-496b-957a-0a9d1cca824b" containerName="manager" Apr 16 19:56:11.455046 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.454980 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="119c9fd9-c6e0-496b-957a-0a9d1cca824b" containerName="manager" Apr 16 19:56:11.455046 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.455009 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b82bf6e4-b659-444d-815d-ec2632766fc9" containerName="authorino" Apr 16 19:56:11.455046 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.455020 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82bf6e4-b659-444d-815d-ec2632766fc9" containerName="authorino" Apr 16 19:56:11.455209 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.455131 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="119c9fd9-c6e0-496b-957a-0a9d1cca824b" containerName="manager" Apr 16 19:56:11.455209 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.455145 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b82bf6e4-b659-444d-815d-ec2632766fc9" containerName="authorino" Apr 16 19:56:11.457960 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.457942 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.460152 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.460134 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g4x49\"/\"kube-root-ca.crt\"" Apr 16 19:56:11.460280 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.460233 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-g4x49\"/\"default-dockercfg-42kbq\"" Apr 16 19:56:11.460900 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.460882 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g4x49\"/\"openshift-service-ca.crt\"" Apr 16 19:56:11.465892 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.465873 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5"] Apr 16 19:56:11.518557 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.518534 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a7534d42-5173-4ac2-8630-4c3270782294-podres\") pod \"perf-node-gather-daemonset-rg7h5\" (UID: \"a7534d42-5173-4ac2-8630-4c3270782294\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.518669 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.518585 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a7534d42-5173-4ac2-8630-4c3270782294-sys\") pod \"perf-node-gather-daemonset-rg7h5\" (UID: \"a7534d42-5173-4ac2-8630-4c3270782294\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.518669 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.518601 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a7534d42-5173-4ac2-8630-4c3270782294-proc\") pod \"perf-node-gather-daemonset-rg7h5\" (UID: \"a7534d42-5173-4ac2-8630-4c3270782294\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.518669 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.518655 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a7534d42-5173-4ac2-8630-4c3270782294-lib-modules\") pod \"perf-node-gather-daemonset-rg7h5\" (UID: \"a7534d42-5173-4ac2-8630-4c3270782294\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.518799 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.518688 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkr8f\" (UniqueName: \"kubernetes.io/projected/a7534d42-5173-4ac2-8630-4c3270782294-kube-api-access-wkr8f\") pod \"perf-node-gather-daemonset-rg7h5\" (UID: \"a7534d42-5173-4ac2-8630-4c3270782294\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.619453 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.619416 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a7534d42-5173-4ac2-8630-4c3270782294-podres\") pod \"perf-node-gather-daemonset-rg7h5\" (UID: \"a7534d42-5173-4ac2-8630-4c3270782294\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.619640 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.619489 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a7534d42-5173-4ac2-8630-4c3270782294-sys\") pod \"perf-node-gather-daemonset-rg7h5\" (UID: \"a7534d42-5173-4ac2-8630-4c3270782294\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" 
Apr 16 19:56:11.619640 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.619517 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a7534d42-5173-4ac2-8630-4c3270782294-proc\") pod \"perf-node-gather-daemonset-rg7h5\" (UID: \"a7534d42-5173-4ac2-8630-4c3270782294\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.619640 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.619553 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a7534d42-5173-4ac2-8630-4c3270782294-lib-modules\") pod \"perf-node-gather-daemonset-rg7h5\" (UID: \"a7534d42-5173-4ac2-8630-4c3270782294\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.619640 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.619587 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkr8f\" (UniqueName: \"kubernetes.io/projected/a7534d42-5173-4ac2-8630-4c3270782294-kube-api-access-wkr8f\") pod \"perf-node-gather-daemonset-rg7h5\" (UID: \"a7534d42-5173-4ac2-8630-4c3270782294\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.619640 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.619594 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a7534d42-5173-4ac2-8630-4c3270782294-podres\") pod \"perf-node-gather-daemonset-rg7h5\" (UID: \"a7534d42-5173-4ac2-8630-4c3270782294\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.619640 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.619601 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a7534d42-5173-4ac2-8630-4c3270782294-sys\") pod \"perf-node-gather-daemonset-rg7h5\" (UID: 
\"a7534d42-5173-4ac2-8630-4c3270782294\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.619640 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.619627 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a7534d42-5173-4ac2-8630-4c3270782294-proc\") pod \"perf-node-gather-daemonset-rg7h5\" (UID: \"a7534d42-5173-4ac2-8630-4c3270782294\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.619897 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.619680 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a7534d42-5173-4ac2-8630-4c3270782294-lib-modules\") pod \"perf-node-gather-daemonset-rg7h5\" (UID: \"a7534d42-5173-4ac2-8630-4c3270782294\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.627868 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.627842 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkr8f\" (UniqueName: \"kubernetes.io/projected/a7534d42-5173-4ac2-8630-4c3270782294-kube-api-access-wkr8f\") pod \"perf-node-gather-daemonset-rg7h5\" (UID: \"a7534d42-5173-4ac2-8630-4c3270782294\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.769331 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.769247 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:11.885024 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.884994 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5"] Apr 16 19:56:11.888578 ip-10-0-130-83 kubenswrapper[2580]: W0416 19:56:11.888552 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda7534d42_5173_4ac2_8630_4c3270782294.slice/crio-c25e2e1a3279df0a03988cb053f48c5764d6326e2a2889be8273955012904f35 WatchSource:0}: Error finding container c25e2e1a3279df0a03988cb053f48c5764d6326e2a2889be8273955012904f35: Status 404 returned error can't find the container with id c25e2e1a3279df0a03988cb053f48c5764d6326e2a2889be8273955012904f35 Apr 16 19:56:11.890145 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:11.890123 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:56:12.417291 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:12.417221 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" event={"ID":"a7534d42-5173-4ac2-8630-4c3270782294","Type":"ContainerStarted","Data":"2a606fb5307b69ffe0d3afbd036137ef1945ca2f78ce7fdd7b3d942bf89b0a3a"} Apr 16 19:56:12.417457 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:12.417296 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" event={"ID":"a7534d42-5173-4ac2-8630-4c3270782294","Type":"ContainerStarted","Data":"c25e2e1a3279df0a03988cb053f48c5764d6326e2a2889be8273955012904f35"} Apr 16 19:56:12.417457 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:12.417378 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:12.420474 ip-10-0-130-83 kubenswrapper[2580]: I0416 
19:56:12.420454 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-5lqq4_ce85330e-a89a-4e63-b5d1-7af0e5319b88/download-server/0.log" Apr 16 19:56:12.432970 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:12.432932 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" podStartSLOduration=1.432920954 podStartE2EDuration="1.432920954s" podCreationTimestamp="2026-04-16 19:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:56:12.431497097 +0000 UTC m=+2279.918694930" watchObservedRunningTime="2026-04-16 19:56:12.432920954 +0000 UTC m=+2279.920118786" Apr 16 19:56:13.690991 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:13.690959 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2pfdm_f5d78334-d61f-4f3e-878c-9726541364d0/dns/0.log" Apr 16 19:56:13.712233 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:13.712202 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2pfdm_f5d78334-d61f-4f3e-878c-9726541364d0/kube-rbac-proxy/0.log" Apr 16 19:56:13.829141 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:13.829114 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8khcf_fad29fdb-1d79-448d-b40e-0652c1dcf698/dns-node-resolver/0.log" Apr 16 19:56:14.395510 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:14.395482 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zmn8r_8a90f765-2c10-429d-99f6-bbcf7122c7a0/node-ca/0.log" Apr 16 19:56:15.209751 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:15.209727 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfvmkwq_daf7c7ae-16fc-40bf-a381-013352b8d1b0/istio-proxy/0.log" Apr 16 19:56:15.405171 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:15.405143 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-k5nss_c5891986-b5a8-4513-9eb2-21ab3a24ee40/istio-proxy/0.log" Apr 16 19:56:15.914350 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:15.914319 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6pw99_512ebef3-162f-4664-8e29-302b9cbd3861/serve-healthcheck-canary/0.log" Apr 16 19:56:16.427657 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:16.427625 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6sxn2_c706952f-9a60-44db-b459-bd44650c58c3/kube-rbac-proxy/0.log" Apr 16 19:56:16.447123 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:16.447096 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6sxn2_c706952f-9a60-44db-b459-bd44650c58c3/exporter/0.log" Apr 16 19:56:16.467884 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:16.467863 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6sxn2_c706952f-9a60-44db-b459-bd44650c58c3/extractor/0.log" Apr 16 19:56:18.430487 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:18.430456 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-rg7h5" Apr 16 19:56:18.668403 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:18.668377 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-66b64c949f-kkqkp_45f8bb8d-c41a-49de-8234-c33c2db30686/manager/0.log" Apr 16 19:56:18.714242 ip-10-0-130-83 kubenswrapper[2580]: 
I0416 19:56:18.714221 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-hlpc9_63e1aaf6-0990-425e-87c6-bacb2d2b4237/postgres/0.log" Apr 16 19:56:19.951693 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:19.951667 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-66b4cb6588-4h2d2_c6dea8fc-3f21-4202-975b-5d3b715d3b90/manager/0.log" Apr 16 19:56:25.887655 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:25.887619 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9pgnh_30909a2b-a27c-4b44-8a1b-c23e90999d15/kube-multus-additional-cni-plugins/0.log" Apr 16 19:56:25.908777 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:25.908750 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9pgnh_30909a2b-a27c-4b44-8a1b-c23e90999d15/egress-router-binary-copy/0.log" Apr 16 19:56:25.928383 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:25.928363 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9pgnh_30909a2b-a27c-4b44-8a1b-c23e90999d15/cni-plugins/0.log" Apr 16 19:56:25.950453 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:25.950433 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9pgnh_30909a2b-a27c-4b44-8a1b-c23e90999d15/bond-cni-plugin/0.log" Apr 16 19:56:25.970530 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:25.970509 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9pgnh_30909a2b-a27c-4b44-8a1b-c23e90999d15/routeoverride-cni/0.log" Apr 16 19:56:25.990731 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:25.990712 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9pgnh_30909a2b-a27c-4b44-8a1b-c23e90999d15/whereabouts-cni-bincopy/0.log" Apr 16 19:56:26.010843 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:26.010829 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9pgnh_30909a2b-a27c-4b44-8a1b-c23e90999d15/whereabouts-cni/0.log" Apr 16 19:56:26.391614 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:26.391588 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pjfwp_030c4af3-2776-4d57-94f4-7fb7b885e5e4/kube-multus/0.log" Apr 16 19:56:26.438791 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:26.438768 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hj7p9_c79c97f4-34fe-4b2b-9f22-401688c77d79/network-metrics-daemon/0.log" Apr 16 19:56:26.457883 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:26.457861 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hj7p9_c79c97f4-34fe-4b2b-9f22-401688c77d79/kube-rbac-proxy/0.log" Apr 16 19:56:27.913910 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:27.913885 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-controller/0.log" Apr 16 19:56:27.931362 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:27.931336 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/0.log" Apr 16 19:56:27.940664 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:27.940647 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovn-acl-logging/1.log" Apr 16 19:56:27.958588 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:27.958567 2580 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/kube-rbac-proxy-node/0.log" Apr 16 19:56:27.982600 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:27.982580 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 19:56:28.002182 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:28.002162 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/northd/0.log" Apr 16 19:56:28.022507 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:28.022483 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/nbdb/0.log" Apr 16 19:56:28.044080 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:28.044064 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/sbdb/0.log" Apr 16 19:56:28.134127 ip-10-0-130-83 kubenswrapper[2580]: I0416 19:56:28.134098 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q4qdv_0244c6e9-6611-4147-8e78-0345faffa52e/ovnkube-controller/0.log"