Apr 20 21:10:30.610729 ip-10-0-129-149 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 21:10:30.610739 ip-10-0-129-149 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 21:10:30.610746 ip-10-0-129-149 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 21:10:30.610975 ip-10-0-129-149 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 21:10:40.792646 ip-10-0-129-149 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 21:10:40.792668 ip-10-0-129-149 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 68cf6a7231dd42928b899e42d9c13c4d --
Apr 20 21:13:09.640460 ip-10-0-129-149 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 21:13:10.065243 ip-10-0-129-149 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 21:13:10.065243 ip-10-0-129-149 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 21:13:10.065243 ip-10-0-129-149 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 21:13:10.065243 ip-10-0-129-149 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 21:13:10.065243 ip-10-0-129-149 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 21:13:10.066555 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.066469 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 21:13:10.071393 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071378 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:13:10.071393 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071394 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071398 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071401 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071404 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071407 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071409 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071412 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071415 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071417 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071420 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071422 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071425 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071427 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071430 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071434 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071438 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071441 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071444 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071447 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:13:10.071462 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071454 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071456 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071459 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071461 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071464 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071467 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071470 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071472 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071475 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071478 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071480 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071483 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071485 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071488 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071490 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071493 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071495 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071498 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071500 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071502 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:13:10.071943 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071505 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071507 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071510 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071512 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071515 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071517 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071520 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071522 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071524 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071527 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071529 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071531 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071534 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071537 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071541 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071543 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071546 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071548 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071551 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071553 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:13:10.072488 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071555 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071558 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071560 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071563 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071566 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071568 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071571 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071573 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071576 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071578 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071581 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071585 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071588 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071591 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071594 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071598 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071600 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071603 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071605 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:13:10.072973 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071608 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071610 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071613 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071615 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071618 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071620 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.071622 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072033 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072038 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072041 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072044 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072048 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072050 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072053 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072056 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072058 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072061 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072064 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072066 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072069 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:13:10.073442 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072071 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072074 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072076 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072079 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072081 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072084 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072086 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072089 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072092 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072095 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072097 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072100 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072102 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072105 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072107 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072110 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072112 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072114 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072117 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072120 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:13:10.073921 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072123 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072125 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072128 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072130 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072133 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072135 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072138 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072140 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072142 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072145 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072147 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072149 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072152 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072154 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072157 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072159 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072164 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072167 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072170 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072173 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:13:10.074573 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072191 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072195 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072198 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072200 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072203 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072206 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072209 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072211 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072214 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072217 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072220 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072224 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072242 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072245 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072248 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072251 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072253 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072256 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072259 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:13:10.075067 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072262 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072264 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072267 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072269 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072272 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072275 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072277 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072279 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072282 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072284 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072287 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072289 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072291 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072294 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072360 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072367 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072374 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072378 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072383 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072386 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072392 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 21:13:10.075555 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072399 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072403 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072408 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072415 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072419 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072422 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072426 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072429 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072432 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072435 2571 flags.go:64] FLAG: --cloud-config=""
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072438 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072441 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072446 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072449 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072452 2571 flags.go:64] FLAG: --config-dir=""
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072454 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072458 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072462 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072464 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072467 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072471 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072474 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072477 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072480 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072483 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 21:13:10.076062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072485 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072490 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072494 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072497 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072500 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072504 2571 flags.go:64] FLAG: --enable-server="true"
Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072507 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072511 2571 flags.go:64] FLAG: --event-burst="100"
Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072514 2571 flags.go:64] FLAG: --event-qps="50"
Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072517 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072520 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072523 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420
21:13:10.072527 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072530 2571 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072533 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072536 2571 flags.go:64] FLAG: --eviction-soft="" Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072539 2571 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072542 2571 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072545 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072548 2571 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072550 2571 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072553 2571 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072556 2571 flags.go:64] FLAG: --feature-gates="" Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072562 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072565 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 21:13:10.076693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072568 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072571 2571 flags.go:64] FLAG: 
--healthz-bind-address="127.0.0.1" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072575 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072578 2571 flags.go:64] FLAG: --help="false" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072581 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072584 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072587 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072590 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072593 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072600 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072603 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072606 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072609 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072612 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072615 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 21:13:10.077356 
ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072618 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072620 2571 flags.go:64] FLAG: --kube-reserved="" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072623 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072626 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072629 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072632 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072634 2571 flags.go:64] FLAG: --lock-file="" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072637 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072640 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 21:13:10.077356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072642 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072648 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072651 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072653 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072656 2571 flags.go:64] FLAG: --logging-format="text" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072659 2571 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072662 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072665 2571 flags.go:64] FLAG: --manifest-url="" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072667 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072672 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072675 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072679 2571 flags.go:64] FLAG: --max-pods="110" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072682 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072685 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072688 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072690 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072693 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072698 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072701 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072709 2571 flags.go:64] FLAG: 
--node-status-max-images="50" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072712 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072715 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072718 2571 flags.go:64] FLAG: --pod-cidr="" Apr 20 21:13:10.077934 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072721 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072727 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072730 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072733 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072735 2571 flags.go:64] FLAG: --port="10250" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072738 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072741 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00b9284f1ce333dec" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072744 2571 flags.go:64] FLAG: --qos-reserved="" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072747 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072750 2571 flags.go:64] FLAG: --register-node="true" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072753 2571 flags.go:64] FLAG: --register-schedulable="true" 
Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072756 2571 flags.go:64] FLAG: --register-with-taints="" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072759 2571 flags.go:64] FLAG: --registry-burst="10" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072762 2571 flags.go:64] FLAG: --registry-qps="5" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072764 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072767 2571 flags.go:64] FLAG: --reserved-memory="" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072771 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072774 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072777 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072784 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072787 2571 flags.go:64] FLAG: --runonce="false" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072790 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072793 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072796 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072798 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072801 2571 flags.go:64] FLAG: 
--storage-driver-buffer-duration="1m0s" Apr 20 21:13:10.078514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072804 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072808 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072811 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072815 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072818 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072821 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072823 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072826 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072829 2571 flags.go:64] FLAG: --system-cgroups="" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072832 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072837 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072840 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072843 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072847 2571 flags.go:64] FLAG: --tls-min-version="" Apr 20 21:13:10.079136 
ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072850 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072853 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072855 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072858 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072861 2571 flags.go:64] FLAG: --v="2" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072865 2571 flags.go:64] FLAG: --version="false" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072869 2571 flags.go:64] FLAG: --vmodule="" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072873 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.072876 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072962 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 21:13:10.079136 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072965 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072969 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072972 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072975 2571 feature_gate.go:328] unrecognized feature gate: 
MachineAPIMigration Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072978 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072980 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072983 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072986 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072988 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072993 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072995 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.072998 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073001 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073004 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073006 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073009 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 
21:13:10.073011 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073013 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073016 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073019 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 21:13:10.079776 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073021 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073024 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073026 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073029 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073031 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073034 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073036 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073038 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073041 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 
20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073045 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073049 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073051 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073054 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073058 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073061 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073064 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073066 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073069 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073072 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 21:13:10.080306 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073074 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073077 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 21:13:10.080783 ip-10-0-129-149 
kubenswrapper[2571]: W0420 21:13:10.073080 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073083 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073085 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073087 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073090 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073093 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073095 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073099 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073102 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073105 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073108 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073110 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073112 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073115 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073117 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073120 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073122 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073125 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 21:13:10.080783 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073127 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073129 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073132 2571 feature_gate.go:328] unrecognized 
feature gate: AWSClusterHostedDNS Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073134 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073137 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073139 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073143 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073145 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073148 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073150 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073153 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073155 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073157 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073160 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073163 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073166 2571 
feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073168 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073171 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073189 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073193 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:13:10.081300 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073196 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073198 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073201 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073204 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073206 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.073209 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.074081 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.081497 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.081512 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081561 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081568 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081571 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081574 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081577 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081579 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081582 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:13:10.081795 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081584 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081587 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081590 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081593 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081595 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081598 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081600 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081603 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081606 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081608 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081610 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081613 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081615 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081618 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081621 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081624 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081626 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081629 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081633 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081637 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:13:10.082203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081640 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081642 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081645 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081648 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081652 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081654 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081657 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081659 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081662 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081664 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081667 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081670 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081672 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081675 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081678 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081680 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081682 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081685 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081687 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081690 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:13:10.082708 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081693 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081697 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081700 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081702 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081705 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081708 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081710 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081713 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081719 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081722 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081724 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081727 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081729 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081732 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081735 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081737 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081740 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081743 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081746 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:13:10.083242 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081749 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081751 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081753 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081756 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081759 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081761 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081764 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081766 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081769 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081771 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081774 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081776 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081778 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081780 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081783 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081785 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081788 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081791 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081793 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:13:10.083718 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081796 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:13:10.084203 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.081801 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 21:13:10.084203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081913 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:13:10.084203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081921 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:13:10.084203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081924 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:13:10.084203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081928 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:13:10.084203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081930 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:13:10.084203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081933 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:13:10.084203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081935 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:13:10.084203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081939 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:13:10.084203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081943 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:13:10.084203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081946 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:13:10.084203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081948 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:13:10.084203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081951 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:13:10.084203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081954 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:13:10.084203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081956 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081959 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081962 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081965 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081968 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081970 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081973 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081976 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081979 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081981 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081984 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081986 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081989 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081991 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081994 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081996 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.081999 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082001 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082004 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:13:10.084584 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082007 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082011 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082019 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082022 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082025 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082027 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082030 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082032 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082035 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082037 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082039 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082042 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082044 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082047 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082049 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082051 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082054 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082056 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082059 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082061 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:13:10.085040 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082063 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082066 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082068 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082071 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082073 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082075 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082078 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082080 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082083 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082085 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082088 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082090 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082092 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082095 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082097 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082106 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082108 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082111 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082113 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082116 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:13:10.085620 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082118 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:13:10.086093 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082120 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:13:10.086093 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082123 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:13:10.086093 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082125 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:13:10.086093 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082128 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:13:10.086093 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082130 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:13:10.086093 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082133 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:13:10.086093 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082135 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:13:10.086093 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082137 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:13:10.086093 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082140 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:13:10.086093 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082142 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:13:10.086093 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082145 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:13:10.086093 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082147 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:13:10.086093 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:10.082149 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:13:10.086093 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.082153 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 21:13:10.086093 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.082836 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 21:13:10.086580 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.085086 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 21:13:10.086580 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.086302 2571 server.go:1019] "Starting client certificate rotation"
Apr 20 21:13:10.086580 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.086420 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 21:13:10.086580 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.086468 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 21:13:10.117653 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.117631 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 21:13:10.120158 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.120138 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 21:13:10.138435 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.138415 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 20 21:13:10.144097 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.144080 2571 log.go:25] "Validated CRI v1 image API"
Apr 20 21:13:10.145705 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.145677 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 21:13:10.146781 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.146760 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 21:13:10.150880 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.150862 2571 fs.go:135] Filesystem UUIDs: map[6e1c454b-856f-493c-9b74-57baaaa34f2e:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a216786c-68fc-47f3-bf43-f4da14c3ba24:/dev/nvme0n1p4]
Apr 20 21:13:10.150932 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.150882 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 21:13:10.156454 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.156243 2571 manager.go:217] Machine: {Timestamp:2026-04-20 21:13:10.154531639 +0000 UTC m=+0.397490752 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096910 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d4a75f4e81991b983d6e5fd56e559 SystemUUID:ec2d4a75-f4e8-1991-b983-d6e5fd56e559 BootID:68cf6a72-31dd-4292-8b89-9e42d9c13c4d Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ad:e1:94:32:ad Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ad:e1:94:32:ad Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c2:7c:7c:a5:ce:25 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 21:13:10.156454 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.156451 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 21:13:10.156563 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.156524 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 21:13:10.157562 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.157539 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 21:13:10.157703 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.157565 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-149.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 21:13:10.157748 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.157712 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 21:13:10.157748 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.157724 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 21:13:10.157748 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.157740 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 21:13:10.158718 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.158707 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 21:13:10.160123 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.160113 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 21:13:10.160237 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.160228 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 21:13:10.162331 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.162322 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 21:13:10.162369 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.162340 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 21:13:10.162369 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.162359 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 21:13:10.162441 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.162372 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 20 21:13:10.162441 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.162393 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 21:13:10.163442 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.163428 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 21:13:10.163499 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.163454 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 21:13:10.164769 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.164748 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2sz5k"
Apr 20 21:13:10.166513 ip-10-0-129-149
kubenswrapper[2571]: I0420 21:13:10.166490 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 21:13:10.168293 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.168279 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 21:13:10.169860 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.169839 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 21:13:10.169860 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.169856 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 21:13:10.169860 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.169862 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 21:13:10.169860 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.169868 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 21:13:10.170099 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.169874 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 21:13:10.170099 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.169879 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 21:13:10.170099 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.169885 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 21:13:10.170099 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.169891 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 21:13:10.170099 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.169898 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 21:13:10.170099 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.169903 2571 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 21:13:10.170099 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.169918 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 21:13:10.170099 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.169928 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 21:13:10.170605 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.170593 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 21:13:10.170605 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.170607 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 21:13:10.170974 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.170952 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-149.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 21:13:10.171014 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.170995 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 21:13:10.172245 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.172231 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2sz5k" Apr 20 21:13:10.174130 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.174117 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 21:13:10.174209 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.174153 2571 server.go:1295] "Started kubelet" Apr 20 21:13:10.174269 ip-10-0-129-149 
kubenswrapper[2571]: I0420 21:13:10.174247 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 21:13:10.174365 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.174324 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 21:13:10.174436 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.174392 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 21:13:10.174960 ip-10-0-129-149 systemd[1]: Started Kubernetes Kubelet. Apr 20 21:13:10.175516 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.175473 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 21:13:10.175914 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.175747 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-149.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 21:13:10.176731 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.176717 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 20 21:13:10.184458 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.184360 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 21:13:10.184730 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.184715 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 21:13:10.185382 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.185366 2571 factory.go:55] Registering systemd factory Apr 20 21:13:10.185445 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.185400 2571 factory.go:223] Registration of the systemd container factory successfully Apr 20 21:13:10.185445 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.185402 2571 volume_manager.go:295] "The desired_state_of_world populator 
starts" Apr 20 21:13:10.185445 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.185417 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 21:13:10.185565 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.185478 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 21:13:10.185565 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.185519 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 20 21:13:10.185565 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.185524 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 20 21:13:10.185703 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.185581 2571 factory.go:153] Registering CRI-O factory Apr 20 21:13:10.185703 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.185588 2571 factory.go:223] Registration of the crio container factory successfully Apr 20 21:13:10.185703 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.185624 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 21:13:10.185703 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.185642 2571 factory.go:103] Registering Raw factory Apr 20 21:13:10.185703 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.185652 2571 manager.go:1196] Started watching for new ooms in manager Apr 20 21:13:10.185888 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.185762 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-149.ec2.internal\" not found" Apr 20 21:13:10.186229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.186087 2571 manager.go:319] Starting recovery of all containers Apr 20 21:13:10.188201 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.188162 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 21:13:10.188507 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.188476 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 21:13:10.192892 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.192866 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-149.ec2.internal\" not found" node="ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.196281 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.196269 2571 manager.go:324] Recovery completed Apr 20 21:13:10.200445 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.200432 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 21:13:10.203414 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.203400 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasSufficientMemory" Apr 20 21:13:10.203481 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.203429 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 21:13:10.203481 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.203442 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasSufficientPID" Apr 20 21:13:10.203917 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.203902 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 21:13:10.203917 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.203915 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 21:13:10.204019 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.203933 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 20 21:13:10.206143 
ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.206131 2571 policy_none.go:49] "None policy: Start" Apr 20 21:13:10.206208 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.206146 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 21:13:10.206208 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.206156 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 20 21:13:10.255787 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.255773 2571 manager.go:341] "Starting Device Plugin manager" Apr 20 21:13:10.262527 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.255846 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 21:13:10.262527 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.255860 2571 server.go:85] "Starting device plugin registration server" Apr 20 21:13:10.262527 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.256058 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 21:13:10.262527 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.256073 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 21:13:10.262527 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.256161 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 21:13:10.262527 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.256245 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 21:13:10.262527 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.256253 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 21:13:10.262527 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.256729 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 20 21:13:10.262527 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.256763 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-149.ec2.internal\" not found" Apr 20 21:13:10.334854 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.334781 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 21:13:10.336142 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.336127 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 21:13:10.336243 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.336155 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 21:13:10.336243 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.336197 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 21:13:10.336243 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.336208 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 21:13:10.336377 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.336248 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 21:13:10.338428 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.338406 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 21:13:10.356375 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.356358 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 21:13:10.357222 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.357207 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasSufficientMemory" Apr 20 21:13:10.357293 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.357236 2571 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 21:13:10.357293 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.357252 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasSufficientPID" Apr 20 21:13:10.357293 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.357273 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.365150 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.365134 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.365256 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.365160 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-149.ec2.internal\": node \"ip-10-0-129-149.ec2.internal\" not found" Apr 20 21:13:10.386009 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.385987 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-149.ec2.internal\" not found" Apr 20 21:13:10.437201 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.437138 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-149.ec2.internal"] Apr 20 21:13:10.437284 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.437246 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 21:13:10.438089 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.438075 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasSufficientMemory" Apr 20 21:13:10.438150 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.438103 2571 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 21:13:10.438150 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.438113 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasSufficientPID" Apr 20 21:13:10.439228 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.439216 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 21:13:10.439379 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.439367 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.439437 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.439397 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 21:13:10.439911 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.439895 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasSufficientMemory" Apr 20 21:13:10.439974 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.439922 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 21:13:10.439974 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.439936 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasSufficientPID" Apr 20 21:13:10.439974 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.439955 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasSufficientMemory" Apr 20 21:13:10.440071 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.439973 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 20 21:13:10.440071 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.439990 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasSufficientPID" Apr 20 21:13:10.441044 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.441031 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.441086 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.441054 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 21:13:10.441718 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.441705 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasSufficientMemory" Apr 20 21:13:10.441782 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.441729 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 21:13:10.441782 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.441739 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeHasSufficientPID" Apr 20 21:13:10.461679 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.461657 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-149.ec2.internal\" not found" node="ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.465914 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.465891 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-149.ec2.internal\" not found" node="ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.486750 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.486732 2571 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-129-149.ec2.internal\" not found" Apr 20 21:13:10.487883 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.487866 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bc561987305693c23e8fd4b20c60c28f-config\") pod \"kube-apiserver-proxy-ip-10-0-129-149.ec2.internal\" (UID: \"bc561987305693c23e8fd4b20c60c28f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.487933 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.487893 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d4f43f5b0dc2bc0e90bb3cf3edee48fe-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal\" (UID: \"d4f43f5b0dc2bc0e90bb3cf3edee48fe\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.487933 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.487911 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4f43f5b0dc2bc0e90bb3cf3edee48fe-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal\" (UID: \"d4f43f5b0dc2bc0e90bb3cf3edee48fe\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.587024 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.586967 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-149.ec2.internal\" not found" Apr 20 21:13:10.588100 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.588083 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d4f43f5b0dc2bc0e90bb3cf3edee48fe-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal\" (UID: \"d4f43f5b0dc2bc0e90bb3cf3edee48fe\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.588148 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.588110 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4f43f5b0dc2bc0e90bb3cf3edee48fe-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal\" (UID: \"d4f43f5b0dc2bc0e90bb3cf3edee48fe\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.588148 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.588127 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bc561987305693c23e8fd4b20c60c28f-config\") pod \"kube-apiserver-proxy-ip-10-0-129-149.ec2.internal\" (UID: \"bc561987305693c23e8fd4b20c60c28f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.588240 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.588200 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bc561987305693c23e8fd4b20c60c28f-config\") pod \"kube-apiserver-proxy-ip-10-0-129-149.ec2.internal\" (UID: \"bc561987305693c23e8fd4b20c60c28f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.588240 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.588208 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d4f43f5b0dc2bc0e90bb3cf3edee48fe-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal\" (UID: \"d4f43f5b0dc2bc0e90bb3cf3edee48fe\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.588240 ip-10-0-129-149 
kubenswrapper[2571]: I0420 21:13:10.588212 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4f43f5b0dc2bc0e90bb3cf3edee48fe-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal\" (UID: \"d4f43f5b0dc2bc0e90bb3cf3edee48fe\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.687653 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.687630 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-149.ec2.internal\" not found" Apr 20 21:13:10.763938 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.763907 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal" Apr 20 21:13:10.768479 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:10.768461 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-149.ec2.internal"
Apr 20 21:13:10.788409 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.788381 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-149.ec2.internal\" not found"
Apr 20 21:13:10.888875 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.888797 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-149.ec2.internal\" not found"
Apr 20 21:13:10.989268 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:10.989238 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-149.ec2.internal\" not found"
Apr 20 21:13:11.085534 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.085509 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 21:13:11.086164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.085637 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 21:13:11.086164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.085646 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 21:13:11.089681 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:11.089660 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-149.ec2.internal\" not found"
Apr 20 21:13:11.174452 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.174410 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 21:08:10 +0000 UTC" deadline="2027-09-18 13:09:24.698464248 +0000 UTC"
Apr 20 21:13:11.174452 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.174443 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12375h56m13.524024091s"
Apr 20 21:13:11.184867 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.184843 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 21:13:11.189768 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:11.189742 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-149.ec2.internal\" not found"
Apr 20 21:13:11.193189 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.193160 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 21:13:11.216794 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.216763 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-nzkkm"
Apr 20 21:13:11.225209 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.225189 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-nzkkm"
Apr 20 21:13:11.249562 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:11.249527 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4f43f5b0dc2bc0e90bb3cf3edee48fe.slice/crio-ca7f3409b0c1aa8accfa7192aecff822da57998b4d32db2b1221188afbe8f1ab WatchSource:0}: Error finding container ca7f3409b0c1aa8accfa7192aecff822da57998b4d32db2b1221188afbe8f1ab: Status 404 returned error can't find the container with id ca7f3409b0c1aa8accfa7192aecff822da57998b4d32db2b1221188afbe8f1ab
Apr 20 21:13:11.249830 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:11.249809 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc561987305693c23e8fd4b20c60c28f.slice/crio-65bfe824c788ed05125c76243f1eb251edc15ab2c6fa15f24815168a70ec49c9 WatchSource:0}: Error finding container 65bfe824c788ed05125c76243f1eb251edc15ab2c6fa15f24815168a70ec49c9: Status 404 returned error can't find the container with id 65bfe824c788ed05125c76243f1eb251edc15ab2c6fa15f24815168a70ec49c9
Apr 20 21:13:11.255839 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.255820 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 21:13:11.290710 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:11.290668 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-149.ec2.internal\" not found"
Apr 20 21:13:11.339799 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.339756 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal" event={"ID":"d4f43f5b0dc2bc0e90bb3cf3edee48fe","Type":"ContainerStarted","Data":"ca7f3409b0c1aa8accfa7192aecff822da57998b4d32db2b1221188afbe8f1ab"}
Apr 20 21:13:11.340691 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.340667 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-149.ec2.internal" event={"ID":"bc561987305693c23e8fd4b20c60c28f","Type":"ContainerStarted","Data":"65bfe824c788ed05125c76243f1eb251edc15ab2c6fa15f24815168a70ec49c9"}
Apr 20 21:13:11.390997 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:11.390974 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-149.ec2.internal\" not found"
Apr 20 21:13:11.491577 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:11.491494 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-149.ec2.internal\" not found"
Apr 20 21:13:11.585391 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.585365 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 21:13:11.592066 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:11.592043 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-149.ec2.internal\" not found"
Apr 20 21:13:11.627557 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.627527 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 21:13:11.685426 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.685393 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal"
Apr 20 21:13:11.698914 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.698442 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 21:13:11.699477 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.699457 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-149.ec2.internal"
Apr 20 21:13:11.707536 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:11.707516 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 21:13:12.072058 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.072026 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 21:13:12.164096 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.164064 2571 apiserver.go:52] "Watching apiserver"
Apr 20 21:13:12.170489 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.170470 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 21:13:12.172290 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.172261 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-6rjhj","openshift-multus/multus-8mchz","openshift-network-operator/iptables-alerter-4dhhn","openshift-ovn-kubernetes/ovnkube-node-q6skp","openshift-dns/node-resolver-9xgfk","openshift-image-registry/node-ca-fwbvz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal","openshift-multus/multus-additional-cni-plugins-rhd4c","openshift-multus/network-metrics-daemon-fk9cw","openshift-network-diagnostics/network-check-target-cxz7h","kube-system/konnectivity-agent-nztmh","kube-system/kube-apiserver-proxy-ip-10-0-129-149.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq"]
Apr 20 21:13:12.174560 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.174536 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h"
Apr 20 21:13:12.174668 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:12.174620 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a"
Apr 20 21:13:12.175685 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.175659 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.175768 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.175752 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4dhhn"
Apr 20 21:13:12.178796 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.176895 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.178796 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.177848 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 21:13:12.178796 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.178054 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 21:13:12.178796 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.178298 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 21:13:12.178796 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.178649 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 21:13:12.179979 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.179339 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 21:13:12.179979 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.179704 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-z6z28\""
Apr 20 21:13:12.181687 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.181272 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 21:13:12.181793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.181695 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jfbzw\""
Apr 20 21:13:12.181848 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.181816 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 21:13:12.181960 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.181942 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 21:13:12.182499 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.182477 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 21:13:12.182499 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.182493 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9xgfk"
Apr 20 21:13:12.182637 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.182498 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fwbvz"
Apr 20 21:13:12.182860 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.182830 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 21:13:12.182966 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.182951 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-p7v8b\""
Apr 20 21:13:12.183136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.183117 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 21:13:12.183228 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.183156 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 21:13:12.183339 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.183316 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 21:13:12.183674 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.183658 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.184657 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.184641 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-s7f2k\""
Apr 20 21:13:12.184657 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.184650 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 21:13:12.184852 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.184837 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 21:13:12.184937 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.184922 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 21:13:12.185087 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.185072 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw"
Apr 20 21:13:12.185153 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:12.185131 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964"
Apr 20 21:13:12.185241 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.185150 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 21:13:12.185395 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.185377 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7wb2n\""
Apr 20 21:13:12.185462 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.185393 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 21:13:12.185752 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.185735 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 21:13:12.185915 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.185899 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 21:13:12.185976 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.185905 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wnhr7\""
Apr 20 21:13:12.186736 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.186715 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.186849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.186779 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nztmh"
Apr 20 21:13:12.188537 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.188495 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq"
Apr 20 21:13:12.188629 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.188549 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-k9nhc\""
Apr 20 21:13:12.188681 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.188652 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 21:13:12.188681 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.188669 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-s9cl2\""
Apr 20 21:13:12.188811 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.188787 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 21:13:12.188897 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.188887 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 21:13:12.189130 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.189113 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 21:13:12.190689 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.190669 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-72zks\""
Apr 20 21:13:12.190779 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.190720 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 21:13:12.191043 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.191024 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 21:13:12.191128 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.191074 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 21:13:12.196224 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196200 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-var-lib-openvswitch\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.196307 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196232 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-cni-netd\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.196307 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196255 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-system-cni-dir\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.196307 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196279 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-var-lib-cni-multus\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.196307 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196305 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1931e219-0173-47c5-a78a-7401b997716c-host-slash\") pod \"iptables-alerter-4dhhn\" (UID: \"1931e219-0173-47c5-a78a-7401b997716c\") " pod="openshift-network-operator/iptables-alerter-4dhhn"
Apr 20 21:13:12.196517 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196348 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.196517 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196394 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-multus-cni-dir\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.196517 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196425 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-run-openvswitch\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.196517 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196452 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh86z\" (UniqueName: \"kubernetes.io/projected/89e3c54c-a866-4c9b-940d-54a417b5c964-kube-api-access-nh86z\") pod \"network-metrics-daemon-fk9cw\" (UID: \"89e3c54c-a866-4c9b-940d-54a417b5c964\") " pod="openshift-multus/network-metrics-daemon-fk9cw"
Apr 20 21:13:12.196517 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196477 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-sysctl-conf\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.196517 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196502 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-sys\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.196790 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196528 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cf801c74-93a7-4e27-ba8a-0c31596e95c6-multus-daemon-config\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.196790 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196554 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-system-cni-dir\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.196790 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196588 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.196790 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196617 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-node-log\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.196790 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196641 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-run-ovn-kubernetes\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.196790 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196664 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/47953ca1-cc2f-4035-8d59-26be8c7a9516-tmp-dir\") pod \"node-resolver-9xgfk\" (UID: \"47953ca1-cc2f-4035-8d59-26be8c7a9516\") " pod="openshift-dns/node-resolver-9xgfk"
Apr 20 21:13:12.196790 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196695 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-var-lib-kubelet\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.196790 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196720 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/86a41be0-f642-425a-a950-24cf589ab648-konnectivity-ca\") pod \"konnectivity-agent-nztmh\" (UID: \"86a41be0-f642-425a-a950-24cf589ab648\") " pod="kube-system/konnectivity-agent-nztmh"
Apr 20 21:13:12.196790 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196739 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-multus-conf-dir\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.196790 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196761 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.196790 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196777 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-run-netns\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.197268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196801 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-var-lib-kubelet\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.197268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196819 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-hostroot\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.197268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196853 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/95826e15-25fd-44ed-bc3e-c54baaa50bb7-serviceca\") pod \"node-ca-fwbvz\" (UID: \"95826e15-25fd-44ed-bc3e-c54baaa50bb7\") " pod="openshift-image-registry/node-ca-fwbvz"
Apr 20 21:13:12.197268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196889 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82c75868-1659-4814-b726-ba733f5f2ebc-ovnkube-script-lib\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.197268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196941 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stvbx\" (UniqueName: \"kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx\") pod \"network-check-target-cxz7h\" (UID: \"369b8c8d-720a-4d32-a69a-64bd50a8103a\") " pod="openshift-network-diagnostics/network-check-target-cxz7h"
Apr 20 21:13:12.197268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.196996 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.197268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197030 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/95826e15-25fd-44ed-bc3e-c54baaa50bb7-host\") pod \"node-ca-fwbvz\" (UID: \"95826e15-25fd-44ed-bc3e-c54baaa50bb7\") " pod="openshift-image-registry/node-ca-fwbvz"
Apr 20 21:13:12.197268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197061 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-run-systemd\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.197268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197087 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-run-ovn\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.197268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197130 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldmtl\" (UniqueName: \"kubernetes.io/projected/82c75868-1659-4814-b726-ba733f5f2ebc-kube-api-access-ldmtl\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.197268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197200 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82c75868-1659-4814-b726-ba733f5f2ebc-ovn-node-metrics-cert\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.197268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197230 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-lib-modules\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.197268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197265 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ht4q\" (UniqueName: \"kubernetes.io/projected/95826e15-25fd-44ed-bc3e-c54baaa50bb7-kube-api-access-4ht4q\") pod \"node-ca-fwbvz\" (UID: \"95826e15-25fd-44ed-bc3e-c54baaa50bb7\") " pod="openshift-image-registry/node-ca-fwbvz"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197290 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-etc-openvswitch\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197314 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-run-netns\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197353 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-etc-kubernetes\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197387 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf801c74-93a7-4e27-ba8a-0c31596e95c6-cni-binary-copy\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197410 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs\") pod \"network-metrics-daemon-fk9cw\" (UID: \"89e3c54c-a866-4c9b-940d-54a417b5c964\") " pod="openshift-multus/network-metrics-daemon-fk9cw"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197434 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-sysconfig\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197461 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-systemd\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197490 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/86a41be0-f642-425a-a950-24cf589ab648-agent-certs\") pod \"konnectivity-agent-nztmh\" (UID: \"86a41be0-f642-425a-a950-24cf589ab648\") " pod="kube-system/konnectivity-agent-nztmh"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197531 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82c75868-1659-4814-b726-ba733f5f2ebc-env-overrides\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197558 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee2db464-93af-49a5-a30a-e8119e1eac63-tmp\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197585 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6rf6\" (UniqueName: \"kubernetes.io/projected/ee2db464-93af-49a5-a30a-e8119e1eac63-kube-api-access-m6rf6\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197615 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-slash\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197648 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-cnibin\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197671 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-run\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197709 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-host\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197740 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw4xh\" (UniqueName: \"kubernetes.io/projected/1931e219-0173-47c5-a78a-7401b997716c-kube-api-access-pw4xh\") pod \"iptables-alerter-4dhhn\" (UID: \"1931e219-0173-47c5-a78a-7401b997716c\") "
pod="openshift-network-operator/iptables-alerter-4dhhn" Apr 20 21:13:12.197774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197765 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-cni-binary-copy\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c" Apr 20 21:13:12.198443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197789 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-log-socket\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.198443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197813 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-cni-bin\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.198443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197837 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/47953ca1-cc2f-4035-8d59-26be8c7a9516-hosts-file\") pod \"node-resolver-9xgfk\" (UID: \"47953ca1-cc2f-4035-8d59-26be8c7a9516\") " pod="openshift-dns/node-resolver-9xgfk" Apr 20 21:13:12.198443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197873 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrl4h\" (UniqueName: 
\"kubernetes.io/projected/47953ca1-cc2f-4035-8d59-26be8c7a9516-kube-api-access-vrl4h\") pod \"node-resolver-9xgfk\" (UID: \"47953ca1-cc2f-4035-8d59-26be8c7a9516\") " pod="openshift-dns/node-resolver-9xgfk" Apr 20 21:13:12.198443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197917 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-modprobe-d\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.198443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197943 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-kubernetes\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.198443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.197967 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-os-release\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.198443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.198012 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-multus-socket-dir-parent\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.198443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.198043 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-kubelet\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.198443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.198067 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-tuned\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.198443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.198091 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-run-k8s-cni-cncf-io\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.198443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.198115 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-var-lib-cni-bin\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.198443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.198137 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-run-multus-certs\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.198443 
ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.198163 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1931e219-0173-47c5-a78a-7401b997716c-iptables-alerter-script\") pod \"iptables-alerter-4dhhn\" (UID: \"1931e219-0173-47c5-a78a-7401b997716c\") " pod="openshift-network-operator/iptables-alerter-4dhhn" Apr 20 21:13:12.198443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.198203 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-cnibin\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c" Apr 20 21:13:12.198443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.198227 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-os-release\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c" Apr 20 21:13:12.198988 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.198250 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-systemd-units\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.198988 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.198275 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/82c75868-1659-4814-b726-ba733f5f2ebc-ovnkube-config\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.198988 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.198310 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-sysctl-d\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.198988 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.198334 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x6rk\" (UniqueName: \"kubernetes.io/projected/cf801c74-93a7-4e27-ba8a-0c31596e95c6-kube-api-access-2x6rk\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.198988 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.198357 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29bbs\" (UniqueName: \"kubernetes.io/projected/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-kube-api-access-29bbs\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c" Apr 20 21:13:12.225918 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.225891 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 21:08:11 +0000 UTC" deadline="2027-12-10 18:59:37.041124974 +0000 UTC" Apr 20 21:13:12.225918 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.225915 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" 
sleep="14373h46m24.815212397s" Apr 20 21:13:12.286229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.286204 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 21:13:12.298728 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.298699 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82c75868-1659-4814-b726-ba733f5f2ebc-ovn-node-metrics-cert\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.298728 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.298732 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-lib-modules\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.298929 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.298749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ht4q\" (UniqueName: \"kubernetes.io/projected/95826e15-25fd-44ed-bc3e-c54baaa50bb7-kube-api-access-4ht4q\") pod \"node-ca-fwbvz\" (UID: \"95826e15-25fd-44ed-bc3e-c54baaa50bb7\") " pod="openshift-image-registry/node-ca-fwbvz" Apr 20 21:13:12.298929 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.298878 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-lib-modules\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.299025 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.298929 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-etc-openvswitch\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.299025 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.298959 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-run-netns\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.299025 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.298988 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-etc-kubernetes\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.299025 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.298988 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-etc-openvswitch\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.299025 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299012 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf801c74-93a7-4e27-ba8a-0c31596e95c6-cni-binary-copy\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.299307 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299031 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-run-netns\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.299307 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299038 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs\") pod \"network-metrics-daemon-fk9cw\" (UID: \"89e3c54c-a866-4c9b-940d-54a417b5c964\") " pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:12.299307 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299076 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-etc-kubernetes\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.299307 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:12.299126 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:12.299307 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299124 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-sysconfig\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.299307 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299136 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 21:13:12.299307 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299166 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-systemd\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.299307 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:12.299224 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs podName:89e3c54c-a866-4c9b-940d-54a417b5c964 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:12.799200835 +0000 UTC m=+3.042159952 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs") pod "network-metrics-daemon-fk9cw" (UID: "89e3c54c-a866-4c9b-940d-54a417b5c964") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:12.299307 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299245 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/86a41be0-f642-425a-a950-24cf589ab648-agent-certs\") pod \"konnectivity-agent-nztmh\" (UID: \"86a41be0-f642-425a-a950-24cf589ab648\") " pod="kube-system/konnectivity-agent-nztmh" Apr 20 21:13:12.299307 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299253 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-systemd\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 
20 21:13:12.299307 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299266 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-sysconfig\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.299307 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299273 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82c75868-1659-4814-b726-ba733f5f2ebc-env-overrides\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.299307 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee2db464-93af-49a5-a30a-e8119e1eac63-tmp\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299335 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6rf6\" (UniqueName: \"kubernetes.io/projected/ee2db464-93af-49a5-a30a-e8119e1eac63-kube-api-access-m6rf6\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299363 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-slash\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 
21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299389 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-cnibin\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299413 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-run\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299437 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-host\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299470 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pw4xh\" (UniqueName: \"kubernetes.io/projected/1931e219-0173-47c5-a78a-7401b997716c-kube-api-access-pw4xh\") pod \"iptables-alerter-4dhhn\" (UID: \"1931e219-0173-47c5-a78a-7401b997716c\") " pod="openshift-network-operator/iptables-alerter-4dhhn" Apr 20 21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-cni-binary-copy\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c" Apr 20 
21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299525 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-socket-dir\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299516 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-cnibin\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299550 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-log-socket\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299591 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-cni-bin\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299616 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/47953ca1-cc2f-4035-8d59-26be8c7a9516-hosts-file\") pod \"node-resolver-9xgfk\" (UID: \"47953ca1-cc2f-4035-8d59-26be8c7a9516\") " pod="openshift-dns/node-resolver-9xgfk" Apr 20 21:13:12.299849 
ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299658 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf801c74-93a7-4e27-ba8a-0c31596e95c6-cni-binary-copy\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299665 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82c75868-1659-4814-b726-ba733f5f2ebc-env-overrides\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299663 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrl4h\" (UniqueName: \"kubernetes.io/projected/47953ca1-cc2f-4035-8d59-26be8c7a9516-kube-api-access-vrl4h\") pod \"node-resolver-9xgfk\" (UID: \"47953ca1-cc2f-4035-8d59-26be8c7a9516\") " pod="openshift-dns/node-resolver-9xgfk" Apr 20 21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299763 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-modprobe-d\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.299849 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299857 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-run\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 
21:13:12.299904 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-host\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299934 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/47953ca1-cc2f-4035-8d59-26be8c7a9516-hosts-file\") pod \"node-resolver-9xgfk\" (UID: \"47953ca1-cc2f-4035-8d59-26be8c7a9516\") " pod="openshift-dns/node-resolver-9xgfk"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299979 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-log-socket\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300018 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-cni-bin\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300097 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-modprobe-d\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.299800 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-slash\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300199 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-kubernetes\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300264 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-os-release\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300309 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-kubernetes\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300372 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-os-release\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300402 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-multus-socket-dir-parent\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300429 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-kubelet\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300451 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-tuned\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300482 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-run-k8s-cni-cncf-io\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300497 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-multus-socket-dir-parent\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300541 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-var-lib-cni-bin\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300578 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-kubelet\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.300986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300504 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-var-lib-cni-bin\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300687 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-run-multus-certs\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300716 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1931e219-0173-47c5-a78a-7401b997716c-iptables-alerter-script\") pod \"iptables-alerter-4dhhn\" (UID: \"1931e219-0173-47c5-a78a-7401b997716c\") " pod="openshift-network-operator/iptables-alerter-4dhhn"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300743 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-cnibin\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300792 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-os-release\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300813 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-run-k8s-cni-cncf-io\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300823 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-cnibin\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300836 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-systemd-units\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300862 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82c75868-1659-4814-b726-ba733f5f2ebc-ovnkube-config\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300864 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-run-multus-certs\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-sysctl-d\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300887 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-os-release\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300904 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-systemd-units\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300912 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2x6rk\" (UniqueName: \"kubernetes.io/projected/cf801c74-93a7-4e27-ba8a-0c31596e95c6-kube-api-access-2x6rk\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300939 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29bbs\" (UniqueName: \"kubernetes.io/projected/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-kube-api-access-29bbs\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300970 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.300999 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-etc-selinux\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq"
Apr 20 21:13:12.301793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301020 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-sysctl-d\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301028 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-var-lib-openvswitch\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301055 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-cni-netd\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301082 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-system-cni-dir\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301129 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-var-lib-cni-multus\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301154 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1931e219-0173-47c5-a78a-7401b997716c-host-slash\") pod \"iptables-alerter-4dhhn\" (UID: \"1931e219-0173-47c5-a78a-7401b997716c\") " pod="openshift-network-operator/iptables-alerter-4dhhn"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301214 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301260 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-multus-cni-dir\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301285 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-run-openvswitch\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301296 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1931e219-0173-47c5-a78a-7401b997716c-iptables-alerter-script\") pod \"iptables-alerter-4dhhn\" (UID: \"1931e219-0173-47c5-a78a-7401b997716c\") " pod="openshift-network-operator/iptables-alerter-4dhhn"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301325 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nh86z\" (UniqueName: \"kubernetes.io/projected/89e3c54c-a866-4c9b-940d-54a417b5c964-kube-api-access-nh86z\") pod \"network-metrics-daemon-fk9cw\" (UID: \"89e3c54c-a866-4c9b-940d-54a417b5c964\") " pod="openshift-multus/network-metrics-daemon-fk9cw"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301346 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82c75868-1659-4814-b726-ba733f5f2ebc-ovnkube-config\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301351 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-sysctl-conf\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301378 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-system-cni-dir\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301404 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-sys\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301417 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-cni-binary-copy\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301433 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cf801c74-93a7-4e27-ba8a-0c31596e95c6-multus-daemon-config\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.302438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301441 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-var-lib-openvswitch\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301461 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-system-cni-dir\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301466 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-sysctl-conf\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301478 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-sys\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301488 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301514 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-node-log\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301539 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-run-ovn-kubernetes\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301540 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-var-lib-cni-multus\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301578 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1931e219-0173-47c5-a78a-7401b997716c-host-slash\") pod \"iptables-alerter-4dhhn\" (UID: \"1931e219-0173-47c5-a78a-7401b997716c\") " pod="openshift-network-operator/iptables-alerter-4dhhn"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301591 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-system-cni-dir\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301610 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/47953ca1-cc2f-4035-8d59-26be8c7a9516-tmp-dir\") pod \"node-resolver-9xgfk\" (UID: \"47953ca1-cc2f-4035-8d59-26be8c7a9516\") " pod="openshift-dns/node-resolver-9xgfk"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301653 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-cni-netd\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301694 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-node-log\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301731 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-run-ovn-kubernetes\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301771 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-run-openvswitch\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301776 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-multus-cni-dir\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301890 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/47953ca1-cc2f-4035-8d59-26be8c7a9516-tmp-dir\") pod \"node-resolver-9xgfk\" (UID: \"47953ca1-cc2f-4035-8d59-26be8c7a9516\") " pod="openshift-dns/node-resolver-9xgfk"
Apr 20 21:13:12.303049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.301963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-var-lib-kubelet\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302028 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302032 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/86a41be0-f642-425a-a950-24cf589ab648-konnectivity-ca\") pod \"konnectivity-agent-nztmh\" (UID: \"86a41be0-f642-425a-a950-24cf589ab648\") " pod="kube-system/konnectivity-agent-nztmh"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302075 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee2db464-93af-49a5-a30a-e8119e1eac63-var-lib-kubelet\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302412 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cf801c74-93a7-4e27-ba8a-0c31596e95c6-multus-daemon-config\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302457 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-multus-conf-dir\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302497 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302524 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-device-dir\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302746 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-run-netns\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302565 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/86a41be0-f642-425a-a950-24cf589ab648-konnectivity-ca\") pod \"konnectivity-agent-nztmh\" (UID: \"86a41be0-f642-425a-a950-24cf589ab648\") " pod="kube-system/konnectivity-agent-nztmh"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302774 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-var-lib-kubelet\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82c75868-1659-4814-b726-ba733f5f2ebc-ovn-node-metrics-cert\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302801 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-hostroot\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302601 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee2db464-93af-49a5-a30a-e8119e1eac63-tmp\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302689 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302843 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/95826e15-25fd-44ed-bc3e-c54baaa50bb7-serviceca\") pod \"node-ca-fwbvz\" (UID: \"95826e15-25fd-44ed-bc3e-c54baaa50bb7\") " pod="openshift-image-registry/node-ca-fwbvz"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302856 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-host-var-lib-kubelet\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.303600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302860 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302622 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-multus-conf-dir\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302877 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-sys-fs\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302905 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpzv8\" (UniqueName: \"kubernetes.io/projected/061de7ab-84ea-4a69-b0d8-2c8f251826a8-kube-api-access-vpzv8\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302942 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-run-netns\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302954 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82c75868-1659-4814-b726-ba733f5f2ebc-ovnkube-script-lib\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302908 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cf801c74-93a7-4e27-ba8a-0c31596e95c6-hostroot\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.302983 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stvbx\" (UniqueName: \"kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx\") pod \"network-check-target-cxz7h\" (UID: \"369b8c8d-720a-4d32-a69a-64bd50a8103a\") " pod="openshift-network-diagnostics/network-check-target-cxz7h"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.303011 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.303035 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/95826e15-25fd-44ed-bc3e-c54baaa50bb7-host\") pod \"node-ca-fwbvz\" (UID: \"95826e15-25fd-44ed-bc3e-c54baaa50bb7\") " pod="openshift-image-registry/node-ca-fwbvz"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.303059 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-run-systemd\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.303082 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-run-ovn\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.303105 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmtl\" (UniqueName: \"kubernetes.io/projected/82c75868-1659-4814-b726-ba733f5f2ebc-kube-api-access-ldmtl\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.303132 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-registration-dir\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.303302 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/95826e15-25fd-44ed-bc3e-c54baaa50bb7-serviceca\") pod \"node-ca-fwbvz\" (UID: \"95826e15-25fd-44ed-bc3e-c54baaa50bb7\") " pod="openshift-image-registry/node-ca-fwbvz"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.303356 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-run-systemd\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.303386 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-run-ovn\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp"
Apr 20 21:13:12.304160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.303397 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/95826e15-25fd-44ed-bc3e-c54baaa50bb7-host\") pod \"node-ca-fwbvz\" (UID: \"95826e15-25fd-44ed-bc3e-c54baaa50bb7\") " pod="openshift-image-registry/node-ca-fwbvz"
Apr 20 21:13:12.304782 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.303394 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82c75868-1659-4814-b726-ba733f5f2ebc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q6skp\" (UID: 
\"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.304782 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.303464 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82c75868-1659-4814-b726-ba733f5f2ebc-ovnkube-script-lib\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.304782 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.304012 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ee2db464-93af-49a5-a30a-e8119e1eac63-etc-tuned\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.304782 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.304519 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/86a41be0-f642-425a-a950-24cf589ab648-agent-certs\") pod \"konnectivity-agent-nztmh\" (UID: \"86a41be0-f642-425a-a950-24cf589ab648\") " pod="kube-system/konnectivity-agent-nztmh" Apr 20 21:13:12.309973 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:12.309949 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:13:12.309973 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:12.309976 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:13:12.310124 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:12.309989 2571 projected.go:194] Error preparing data for projected volume kube-api-access-stvbx for pod 
openshift-network-diagnostics/network-check-target-cxz7h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:12.310253 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:12.310238 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx podName:369b8c8d-720a-4d32-a69a-64bd50a8103a nodeName:}" failed. No retries permitted until 2026-04-20 21:13:12.810139393 +0000 UTC m=+3.053098502 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-stvbx" (UniqueName: "kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx") pod "network-check-target-cxz7h" (UID: "369b8c8d-720a-4d32-a69a-64bd50a8103a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:12.312107 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.312079 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh86z\" (UniqueName: \"kubernetes.io/projected/89e3c54c-a866-4c9b-940d-54a417b5c964-kube-api-access-nh86z\") pod \"network-metrics-daemon-fk9cw\" (UID: \"89e3c54c-a866-4c9b-940d-54a417b5c964\") " pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:12.312252 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.312231 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw4xh\" (UniqueName: \"kubernetes.io/projected/1931e219-0173-47c5-a78a-7401b997716c-kube-api-access-pw4xh\") pod \"iptables-alerter-4dhhn\" (UID: \"1931e219-0173-47c5-a78a-7401b997716c\") " pod="openshift-network-operator/iptables-alerter-4dhhn" Apr 20 21:13:12.312569 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.312553 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-m6rf6\" (UniqueName: \"kubernetes.io/projected/ee2db464-93af-49a5-a30a-e8119e1eac63-kube-api-access-m6rf6\") pod \"tuned-6rjhj\" (UID: \"ee2db464-93af-49a5-a30a-e8119e1eac63\") " pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.313571 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.313551 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29bbs\" (UniqueName: \"kubernetes.io/projected/6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9-kube-api-access-29bbs\") pod \"multus-additional-cni-plugins-rhd4c\" (UID: \"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9\") " pod="openshift-multus/multus-additional-cni-plugins-rhd4c" Apr 20 21:13:12.313719 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.313698 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ht4q\" (UniqueName: \"kubernetes.io/projected/95826e15-25fd-44ed-bc3e-c54baaa50bb7-kube-api-access-4ht4q\") pod \"node-ca-fwbvz\" (UID: \"95826e15-25fd-44ed-bc3e-c54baaa50bb7\") " pod="openshift-image-registry/node-ca-fwbvz" Apr 20 21:13:12.314171 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.314148 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldmtl\" (UniqueName: \"kubernetes.io/projected/82c75868-1659-4814-b726-ba733f5f2ebc-kube-api-access-ldmtl\") pod \"ovnkube-node-q6skp\" (UID: \"82c75868-1659-4814-b726-ba733f5f2ebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.314351 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.314335 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrl4h\" (UniqueName: \"kubernetes.io/projected/47953ca1-cc2f-4035-8d59-26be8c7a9516-kube-api-access-vrl4h\") pod \"node-resolver-9xgfk\" (UID: \"47953ca1-cc2f-4035-8d59-26be8c7a9516\") " pod="openshift-dns/node-resolver-9xgfk" Apr 20 21:13:12.314539 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.314523 
2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x6rk\" (UniqueName: \"kubernetes.io/projected/cf801c74-93a7-4e27-ba8a-0c31596e95c6-kube-api-access-2x6rk\") pod \"multus-8mchz\" (UID: \"cf801c74-93a7-4e27-ba8a-0c31596e95c6\") " pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.403994 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.403910 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-registration-dir\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.403994 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.403962 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-socket-dir\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.404164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.403997 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.404164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.404020 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-etc-selinux\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.404164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.404039 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-registration-dir\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.404164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.404056 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-device-dir\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.404164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.404082 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-sys-fs\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.404164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.404106 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-etc-selinux\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.404164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.404107 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpzv8\" (UniqueName: 
\"kubernetes.io/projected/061de7ab-84ea-4a69-b0d8-2c8f251826a8-kube-api-access-vpzv8\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.404164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.404125 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-device-dir\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.404164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.404152 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-sys-fs\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.404533 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.404218 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.404533 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.404218 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/061de7ab-84ea-4a69-b0d8-2c8f251826a8-socket-dir\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.412208 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.412164 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpzv8\" (UniqueName: \"kubernetes.io/projected/061de7ab-84ea-4a69-b0d8-2c8f251826a8-kube-api-access-vpzv8\") pod \"aws-ebs-csi-driver-node-vd9tq\" (UID: \"061de7ab-84ea-4a69-b0d8-2c8f251826a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.490194 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.490157 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8mchz" Apr 20 21:13:12.490420 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.490397 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 21:13:12.498092 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.498072 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4dhhn" Apr 20 21:13:12.506904 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.506883 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:12.511755 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.511737 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9xgfk" Apr 20 21:13:12.517274 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.517254 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fwbvz" Apr 20 21:13:12.526820 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.526804 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rhd4c" Apr 20 21:13:12.533369 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.533348 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" Apr 20 21:13:12.538941 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.538923 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nztmh" Apr 20 21:13:12.544491 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.544473 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" Apr 20 21:13:12.806477 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.806436 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs\") pod \"network-metrics-daemon-fk9cw\" (UID: \"89e3c54c-a866-4c9b-940d-54a417b5c964\") " pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:12.806647 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:12.806611 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:12.806717 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:12.806679 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs podName:89e3c54c-a866-4c9b-940d-54a417b5c964 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:13.806659404 +0000 UTC m=+4.049618502 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs") pod "network-metrics-daemon-fk9cw" (UID: "89e3c54c-a866-4c9b-940d-54a417b5c964") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:12.859128 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:12.859095 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a5af7de_f9f6_43b2_aa6a_2a145cafbfb9.slice/crio-30c805e1e26e7bdaa2ffcc5ec0abb829eea5038ad8beac0cf5d5cd74a9099f9d WatchSource:0}: Error finding container 30c805e1e26e7bdaa2ffcc5ec0abb829eea5038ad8beac0cf5d5cd74a9099f9d: Status 404 returned error can't find the container with id 30c805e1e26e7bdaa2ffcc5ec0abb829eea5038ad8beac0cf5d5cd74a9099f9d Apr 20 21:13:12.861272 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:12.861131 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86a41be0_f642_425a_a950_24cf589ab648.slice/crio-9f370bab0b117d3aa07c358c5e9344ac16ea9ec4910822f562c28c5e7ff36df3 WatchSource:0}: Error finding container 9f370bab0b117d3aa07c358c5e9344ac16ea9ec4910822f562c28c5e7ff36df3: Status 404 returned error can't find the container with id 9f370bab0b117d3aa07c358c5e9344ac16ea9ec4910822f562c28c5e7ff36df3 Apr 20 21:13:12.862832 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:12.862810 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82c75868_1659_4814_b726_ba733f5f2ebc.slice/crio-0960432cbcb567268f1b225e2604ddb437d081b9c2f952354129eb8365494572 WatchSource:0}: Error finding container 0960432cbcb567268f1b225e2604ddb437d081b9c2f952354129eb8365494572: Status 404 returned error can't find the container with id 0960432cbcb567268f1b225e2604ddb437d081b9c2f952354129eb8365494572 Apr 20 21:13:12.863535 
ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:12.863500 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47953ca1_cc2f_4035_8d59_26be8c7a9516.slice/crio-403a9f02cd5fb3d6260b274f7293e57606606aa4760ce588597c1fab424423d4 WatchSource:0}: Error finding container 403a9f02cd5fb3d6260b274f7293e57606606aa4760ce588597c1fab424423d4: Status 404 returned error can't find the container with id 403a9f02cd5fb3d6260b274f7293e57606606aa4760ce588597c1fab424423d4 Apr 20 21:13:12.864844 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:12.864813 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee2db464_93af_49a5_a30a_e8119e1eac63.slice/crio-e75ac180c63b235312c2f056b85cb1a5abbb12b0c8576fd1cbc2a2855704b012 WatchSource:0}: Error finding container e75ac180c63b235312c2f056b85cb1a5abbb12b0c8576fd1cbc2a2855704b012: Status 404 returned error can't find the container with id e75ac180c63b235312c2f056b85cb1a5abbb12b0c8576fd1cbc2a2855704b012 Apr 20 21:13:12.865862 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:12.865839 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod061de7ab_84ea_4a69_b0d8_2c8f251826a8.slice/crio-0982134e6d820f12ce3ebc89a042e8d0c5a669135484ef0ecc1fc9a454810ec7 WatchSource:0}: Error finding container 0982134e6d820f12ce3ebc89a042e8d0c5a669135484ef0ecc1fc9a454810ec7: Status 404 returned error can't find the container with id 0982134e6d820f12ce3ebc89a042e8d0c5a669135484ef0ecc1fc9a454810ec7 Apr 20 21:13:12.867401 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:12.866723 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf801c74_93a7_4e27_ba8a_0c31596e95c6.slice/crio-e0076ea23d98d1f5b8cdcd81440b737637f7b06e1a7a6cbccac192b140a5c4ed WatchSource:0}: 
Error finding container e0076ea23d98d1f5b8cdcd81440b737637f7b06e1a7a6cbccac192b140a5c4ed: Status 404 returned error can't find the container with id e0076ea23d98d1f5b8cdcd81440b737637f7b06e1a7a6cbccac192b140a5c4ed Apr 20 21:13:12.870550 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:12.870510 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95826e15_25fd_44ed_bc3e_c54baaa50bb7.slice/crio-b7ecd7bd36afbce395ae9036e1087d83b07439b5cb9340d507c9b4154b9fb91f WatchSource:0}: Error finding container b7ecd7bd36afbce395ae9036e1087d83b07439b5cb9340d507c9b4154b9fb91f: Status 404 returned error can't find the container with id b7ecd7bd36afbce395ae9036e1087d83b07439b5cb9340d507c9b4154b9fb91f Apr 20 21:13:12.907043 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:12.907020 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stvbx\" (UniqueName: \"kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx\") pod \"network-check-target-cxz7h\" (UID: \"369b8c8d-720a-4d32-a69a-64bd50a8103a\") " pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:12.907138 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:12.907126 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:13:12.907172 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:12.907141 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:13:12.907172 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:12.907150 2571 projected.go:194] Error preparing data for projected volume kube-api-access-stvbx for pod openshift-network-diagnostics/network-check-target-cxz7h: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:12.907264 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:12.907214 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx podName:369b8c8d-720a-4d32-a69a-64bd50a8103a nodeName:}" failed. No retries permitted until 2026-04-20 21:13:13.907200903 +0000 UTC m=+4.150160001 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-stvbx" (UniqueName: "kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx") pod "network-check-target-cxz7h" (UID: "369b8c8d-720a-4d32-a69a-64bd50a8103a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:13.227421 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:13.227298 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 21:08:11 +0000 UTC" deadline="2027-11-17 09:23:55.882265306 +0000 UTC" Apr 20 21:13:13.227421 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:13.227336 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13812h10m42.654933327s" Apr 20 21:13:13.368459 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:13.368418 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-149.ec2.internal" event={"ID":"bc561987305693c23e8fd4b20c60c28f","Type":"ContainerStarted","Data":"a9bf6274467622177c8431891ff4f4434dd9023edd40c170670a4c9b50942712"} Apr 20 21:13:13.395423 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:13.395385 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8mchz" 
event={"ID":"cf801c74-93a7-4e27-ba8a-0c31596e95c6","Type":"ContainerStarted","Data":"e0076ea23d98d1f5b8cdcd81440b737637f7b06e1a7a6cbccac192b140a5c4ed"} Apr 20 21:13:13.403716 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:13.403662 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fwbvz" event={"ID":"95826e15-25fd-44ed-bc3e-c54baaa50bb7","Type":"ContainerStarted","Data":"b7ecd7bd36afbce395ae9036e1087d83b07439b5cb9340d507c9b4154b9fb91f"} Apr 20 21:13:13.412324 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:13.412299 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4dhhn" event={"ID":"1931e219-0173-47c5-a78a-7401b997716c","Type":"ContainerStarted","Data":"83fe784727937d64228c4377c45ef648a5390f9cd7fe3c1838e8e15c8e65975b"} Apr 20 21:13:13.418764 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:13.418704 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" event={"ID":"ee2db464-93af-49a5-a30a-e8119e1eac63","Type":"ContainerStarted","Data":"e75ac180c63b235312c2f056b85cb1a5abbb12b0c8576fd1cbc2a2855704b012"} Apr 20 21:13:13.423563 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:13.423540 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" event={"ID":"061de7ab-84ea-4a69-b0d8-2c8f251826a8","Type":"ContainerStarted","Data":"0982134e6d820f12ce3ebc89a042e8d0c5a669135484ef0ecc1fc9a454810ec7"} Apr 20 21:13:13.434589 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:13.434559 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9xgfk" event={"ID":"47953ca1-cc2f-4035-8d59-26be8c7a9516","Type":"ContainerStarted","Data":"403a9f02cd5fb3d6260b274f7293e57606606aa4760ce588597c1fab424423d4"} Apr 20 21:13:13.437158 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:13.437098 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" event={"ID":"82c75868-1659-4814-b726-ba733f5f2ebc","Type":"ContainerStarted","Data":"0960432cbcb567268f1b225e2604ddb437d081b9c2f952354129eb8365494572"} Apr 20 21:13:13.442868 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:13.442803 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nztmh" event={"ID":"86a41be0-f642-425a-a950-24cf589ab648","Type":"ContainerStarted","Data":"9f370bab0b117d3aa07c358c5e9344ac16ea9ec4910822f562c28c5e7ff36df3"} Apr 20 21:13:13.455099 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:13.455071 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rhd4c" event={"ID":"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9","Type":"ContainerStarted","Data":"30c805e1e26e7bdaa2ffcc5ec0abb829eea5038ad8beac0cf5d5cd74a9099f9d"} Apr 20 21:13:13.814564 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:13.814489 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs\") pod \"network-metrics-daemon-fk9cw\" (UID: \"89e3c54c-a866-4c9b-940d-54a417b5c964\") " pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:13.814723 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:13.814640 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:13.814723 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:13.814702 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs podName:89e3c54c-a866-4c9b-940d-54a417b5c964 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:15.814682153 +0000 UTC m=+6.057641256 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs") pod "network-metrics-daemon-fk9cw" (UID: "89e3c54c-a866-4c9b-940d-54a417b5c964") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:13.915211 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:13.915153 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stvbx\" (UniqueName: \"kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx\") pod \"network-check-target-cxz7h\" (UID: \"369b8c8d-720a-4d32-a69a-64bd50a8103a\") " pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:13.915371 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:13.915317 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:13:13.915371 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:13.915336 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:13:13.915371 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:13.915347 2571 projected.go:194] Error preparing data for projected volume kube-api-access-stvbx for pod openshift-network-diagnostics/network-check-target-cxz7h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:13.915535 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:13.915402 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx podName:369b8c8d-720a-4d32-a69a-64bd50a8103a nodeName:}" failed. 
No retries permitted until 2026-04-20 21:13:15.915386322 +0000 UTC m=+6.158345420 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-stvbx" (UniqueName: "kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx") pod "network-check-target-cxz7h" (UID: "369b8c8d-720a-4d32-a69a-64bd50a8103a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:14.340031 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:14.339085 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:14.340031 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:14.339251 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964" Apr 20 21:13:14.340031 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:14.339660 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:14.340031 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:14.339749 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a" Apr 20 21:13:14.487808 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:14.487051 2571 generic.go:358] "Generic (PLEG): container finished" podID="d4f43f5b0dc2bc0e90bb3cf3edee48fe" containerID="a5bb05bdcd816ffab117aba3fd71e84745772e91939e1a634d0b5fd26e6b6f5b" exitCode=0 Apr 20 21:13:14.487808 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:14.487600 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal" event={"ID":"d4f43f5b0dc2bc0e90bb3cf3edee48fe","Type":"ContainerDied","Data":"a5bb05bdcd816ffab117aba3fd71e84745772e91939e1a634d0b5fd26e6b6f5b"} Apr 20 21:13:14.503727 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:14.503671 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-149.ec2.internal" podStartSLOduration=3.503654206 podStartE2EDuration="3.503654206s" podCreationTimestamp="2026-04-20 21:13:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:13:13.393102799 +0000 UTC m=+3.636061921" watchObservedRunningTime="2026-04-20 21:13:14.503654206 +0000 UTC m=+4.746613328" Apr 20 21:13:15.496146 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:15.496112 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal" event={"ID":"d4f43f5b0dc2bc0e90bb3cf3edee48fe","Type":"ContainerStarted","Data":"602af908fc55d63ec36ec1a315c1f25bd888c612586d47421355f650aab3d627"} Apr 20 21:13:15.830893 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:15.830807 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs\") pod \"network-metrics-daemon-fk9cw\" (UID: \"89e3c54c-a866-4c9b-940d-54a417b5c964\") " pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:15.831050 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:15.830976 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:15.831112 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:15.831060 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs podName:89e3c54c-a866-4c9b-940d-54a417b5c964 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:19.831038447 +0000 UTC m=+10.073997545 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs") pod "network-metrics-daemon-fk9cw" (UID: "89e3c54c-a866-4c9b-940d-54a417b5c964") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:15.931823 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:15.931784 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stvbx\" (UniqueName: \"kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx\") pod \"network-check-target-cxz7h\" (UID: \"369b8c8d-720a-4d32-a69a-64bd50a8103a\") " pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:15.932527 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:15.932008 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:13:15.932527 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:15.932030 2571 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:13:15.932527 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:15.932042 2571 projected.go:194] Error preparing data for projected volume kube-api-access-stvbx for pod openshift-network-diagnostics/network-check-target-cxz7h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:15.932527 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:15.932113 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx podName:369b8c8d-720a-4d32-a69a-64bd50a8103a nodeName:}" failed. No retries permitted until 2026-04-20 21:13:19.932093988 +0000 UTC m=+10.175053120 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-stvbx" (UniqueName: "kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx") pod "network-check-target-cxz7h" (UID: "369b8c8d-720a-4d32-a69a-64bd50a8103a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:16.337228 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:16.337191 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:16.337392 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:16.337328 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a" Apr 20 21:13:16.341287 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:16.338392 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:16.341287 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:16.338557 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964" Apr 20 21:13:18.338812 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:18.338332 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:18.338812 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:18.338380 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:18.338812 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:18.338467 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a" Apr 20 21:13:18.338812 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:18.338572 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964" Apr 20 21:13:19.866843 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:19.866810 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs\") pod \"network-metrics-daemon-fk9cw\" (UID: \"89e3c54c-a866-4c9b-940d-54a417b5c964\") " pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:19.867312 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:19.866962 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:19.867312 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:19.867034 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs podName:89e3c54c-a866-4c9b-940d-54a417b5c964 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:27.867014628 +0000 UTC m=+18.109973743 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs") pod "network-metrics-daemon-fk9cw" (UID: "89e3c54c-a866-4c9b-940d-54a417b5c964") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:19.968081 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:19.967449 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stvbx\" (UniqueName: \"kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx\") pod \"network-check-target-cxz7h\" (UID: \"369b8c8d-720a-4d32-a69a-64bd50a8103a\") " pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:19.968081 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:19.967629 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:13:19.968081 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:19.967650 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:13:19.968081 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:19.967664 2571 projected.go:194] Error preparing data for projected volume kube-api-access-stvbx for pod openshift-network-diagnostics/network-check-target-cxz7h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:19.968081 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:19.967717 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx podName:369b8c8d-720a-4d32-a69a-64bd50a8103a nodeName:}" failed. 
No retries permitted until 2026-04-20 21:13:27.967703 +0000 UTC m=+18.210662097 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-stvbx" (UniqueName: "kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx") pod "network-check-target-cxz7h" (UID: "369b8c8d-720a-4d32-a69a-64bd50a8103a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:20.337980 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:20.337943 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:20.346357 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:20.346317 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964" Apr 20 21:13:20.346890 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:20.346873 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:20.347015 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:20.346999 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a" Apr 20 21:13:22.337370 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:22.337290 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:22.337370 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:22.337308 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:22.337826 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:22.337409 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a" Apr 20 21:13:22.337826 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:22.337527 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964" Apr 20 21:13:24.340258 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:24.340221 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:24.340664 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:24.340235 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:24.340664 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:24.340331 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a" Apr 20 21:13:24.340664 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:24.340433 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964" Apr 20 21:13:26.337075 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:26.337039 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:26.337075 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:26.337076 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:26.337612 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:26.337191 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964" Apr 20 21:13:26.337612 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:26.337301 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a" Apr 20 21:13:27.925368 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:27.925320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs\") pod \"network-metrics-daemon-fk9cw\" (UID: \"89e3c54c-a866-4c9b-940d-54a417b5c964\") " pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:27.925788 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:27.925448 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:27.925788 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:27.925513 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs podName:89e3c54c-a866-4c9b-940d-54a417b5c964 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:43.925498788 +0000 UTC m=+34.168457890 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs") pod "network-metrics-daemon-fk9cw" (UID: "89e3c54c-a866-4c9b-940d-54a417b5c964") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:13:28.026286 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:28.026236 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stvbx\" (UniqueName: \"kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx\") pod \"network-check-target-cxz7h\" (UID: \"369b8c8d-720a-4d32-a69a-64bd50a8103a\") " pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:28.026454 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:28.026402 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:13:28.026454 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:28.026419 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:13:28.026454 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:28.026428 2571 projected.go:194] Error preparing data for projected volume kube-api-access-stvbx for pod openshift-network-diagnostics/network-check-target-cxz7h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:28.026572 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:28.026479 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx podName:369b8c8d-720a-4d32-a69a-64bd50a8103a nodeName:}" failed. 
No retries permitted until 2026-04-20 21:13:44.026465579 +0000 UTC m=+34.269424682 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-stvbx" (UniqueName: "kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx") pod "network-check-target-cxz7h" (UID: "369b8c8d-720a-4d32-a69a-64bd50a8103a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:13:28.337339 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:28.337307 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:28.337575 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:28.337419 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a" Apr 20 21:13:28.337575 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:28.337515 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:28.337661 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:28.337640 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964" Apr 20 21:13:30.338704 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:30.337941 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:30.338704 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:30.338366 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:30.338704 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:30.338438 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a" Apr 20 21:13:30.339399 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:30.339343 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964" Apr 20 21:13:30.520278 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:30.520207 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fwbvz" event={"ID":"95826e15-25fd-44ed-bc3e-c54baaa50bb7","Type":"ContainerStarted","Data":"989a93e9f76fd0eacc48a647cb2d8aa89b544652d7bb855e76e40f5c4e0dcbb0"} Apr 20 21:13:30.521676 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:30.521509 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" event={"ID":"ee2db464-93af-49a5-a30a-e8119e1eac63","Type":"ContainerStarted","Data":"36c1e0875ede892eed50d4a0d426353a7b88d03a0643912d9f97895b8c9beee1"} Apr 20 21:13:30.522917 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:30.522833 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" event={"ID":"061de7ab-84ea-4a69-b0d8-2c8f251826a8","Type":"ContainerStarted","Data":"a7e32a04f6b32167b798f6c4ae1499d26549a1de65a25c0e5b88e248dc3da14b"} Apr 20 21:13:30.524215 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:30.524191 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9xgfk" event={"ID":"47953ca1-cc2f-4035-8d59-26be8c7a9516","Type":"ContainerStarted","Data":"a89656fb4a843b1070b30f14c9691cc57879c3d8ec35bd7e85ba78de7b90d98b"} Apr 20 21:13:30.525456 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:30.525424 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" event={"ID":"82c75868-1659-4814-b726-ba733f5f2ebc","Type":"ContainerStarted","Data":"bfc8225d1a85e99d91ce655703407db519135e04947061163cb8f47c3207156b"} Apr 20 21:13:30.527527 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:30.527487 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nztmh" 
event={"ID":"86a41be0-f642-425a-a950-24cf589ab648","Type":"ContainerStarted","Data":"554705588a23ce073b89e1dfe37fc46ccb4896b993f32eb5e3aac22ece7d8127"} Apr 20 21:13:30.528963 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:30.528940 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rhd4c" event={"ID":"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9","Type":"ContainerStarted","Data":"34b520c537c3f3dae631dab0f0e93013b73730ade107631140d08b9231df596a"} Apr 20 21:13:30.530245 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:30.530220 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8mchz" event={"ID":"cf801c74-93a7-4e27-ba8a-0c31596e95c6","Type":"ContainerStarted","Data":"b536237a5554b9ff1ecdac7699c22806ecee8fb7b13d16376a8ae65c3aaebf8d"} Apr 20 21:13:30.533430 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:30.533393 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fwbvz" podStartSLOduration=3.300622285 podStartE2EDuration="20.533381193s" podCreationTimestamp="2026-04-20 21:13:10 +0000 UTC" firstStartedPulling="2026-04-20 21:13:12.873455228 +0000 UTC m=+3.116414336" lastFinishedPulling="2026-04-20 21:13:30.106214147 +0000 UTC m=+20.349173244" observedRunningTime="2026-04-20 21:13:30.533148849 +0000 UTC m=+20.776107970" watchObservedRunningTime="2026-04-20 21:13:30.533381193 +0000 UTC m=+20.776340313" Apr 20 21:13:30.533535 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:30.533476 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-149.ec2.internal" podStartSLOduration=19.533469773 podStartE2EDuration="19.533469773s" podCreationTimestamp="2026-04-20 21:13:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:13:15.510867425 +0000 UTC 
m=+5.753826545" watchObservedRunningTime="2026-04-20 21:13:30.533469773 +0000 UTC m=+20.776428895" Apr 20 21:13:30.576865 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:30.575559 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8mchz" podStartSLOduration=3.275222146 podStartE2EDuration="20.575542371s" podCreationTimestamp="2026-04-20 21:13:10 +0000 UTC" firstStartedPulling="2026-04-20 21:13:12.87372796 +0000 UTC m=+3.116687062" lastFinishedPulling="2026-04-20 21:13:30.174048187 +0000 UTC m=+20.417007287" observedRunningTime="2026-04-20 21:13:30.575231406 +0000 UTC m=+20.818190527" watchObservedRunningTime="2026-04-20 21:13:30.575542371 +0000 UTC m=+20.818501493" Apr 20 21:13:30.594089 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:30.594041 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6rjhj" podStartSLOduration=3.358501033 podStartE2EDuration="20.594025183s" podCreationTimestamp="2026-04-20 21:13:10 +0000 UTC" firstStartedPulling="2026-04-20 21:13:12.870734921 +0000 UTC m=+3.113694028" lastFinishedPulling="2026-04-20 21:13:30.10625907 +0000 UTC m=+20.349218178" observedRunningTime="2026-04-20 21:13:30.593883386 +0000 UTC m=+20.836842507" watchObservedRunningTime="2026-04-20 21:13:30.594025183 +0000 UTC m=+20.836984302" Apr 20 21:13:30.606932 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:30.606893 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nztmh" podStartSLOduration=3.364493387 podStartE2EDuration="20.606879754s" podCreationTimestamp="2026-04-20 21:13:10 +0000 UTC" firstStartedPulling="2026-04-20 21:13:12.863874576 +0000 UTC m=+3.106833691" lastFinishedPulling="2026-04-20 21:13:30.106260958 +0000 UTC m=+20.349220058" observedRunningTime="2026-04-20 21:13:30.606675375 +0000 UTC m=+20.849634495" watchObservedRunningTime="2026-04-20 21:13:30.606879754 +0000 UTC 
m=+20.849838873" Apr 20 21:13:31.394382 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:31.394214 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 21:13:31.533343 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:31.533250 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" event={"ID":"061de7ab-84ea-4a69-b0d8-2c8f251826a8","Type":"ContainerStarted","Data":"ab1633feb0a58f318e6aa05a08a8d76457962baea7e7a93f92e30568e1c674e5"} Apr 20 21:13:31.535500 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:31.535475 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log" Apr 20 21:13:31.535813 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:31.535791 2571 generic.go:358] "Generic (PLEG): container finished" podID="82c75868-1659-4814-b726-ba733f5f2ebc" containerID="e18771a971ec7e53ea02a8aee736258f7a1c2aea571f282c784f43b347281c4d" exitCode=1 Apr 20 21:13:31.535886 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:31.535863 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" event={"ID":"82c75868-1659-4814-b726-ba733f5f2ebc","Type":"ContainerStarted","Data":"d1247a6b71a5cd777f18d5d799ce805e8f50dec8849bfd9ac411f83b10cfb5fc"} Apr 20 21:13:31.535938 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:31.535899 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" event={"ID":"82c75868-1659-4814-b726-ba733f5f2ebc","Type":"ContainerStarted","Data":"13c8c08f3b6351e1763a611130546333eb31dbf307bb043d5cb000b73737c128"} Apr 20 21:13:31.535938 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:31.535913 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" event={"ID":"82c75868-1659-4814-b726-ba733f5f2ebc","Type":"ContainerStarted","Data":"79c893e94ad01625bd5f265e2b3539d7667df6c44ab9291a049ed36d47e42332"} Apr 20 21:13:31.535938 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:31.535923 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" event={"ID":"82c75868-1659-4814-b726-ba733f5f2ebc","Type":"ContainerStarted","Data":"3a9fc2cbf919699b9f13813d694ef7202d7d3907f05b5f3561c76330575f58f5"} Apr 20 21:13:31.535938 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:31.535932 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" event={"ID":"82c75868-1659-4814-b726-ba733f5f2ebc","Type":"ContainerDied","Data":"e18771a971ec7e53ea02a8aee736258f7a1c2aea571f282c784f43b347281c4d"} Apr 20 21:13:31.537221 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:31.537197 2571 generic.go:358] "Generic (PLEG): container finished" podID="6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9" containerID="34b520c537c3f3dae631dab0f0e93013b73730ade107631140d08b9231df596a" exitCode=0 Apr 20 21:13:31.537351 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:31.537304 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rhd4c" event={"ID":"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9","Type":"ContainerDied","Data":"34b520c537c3f3dae631dab0f0e93013b73730ade107631140d08b9231df596a"} Apr 20 21:13:31.553770 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:31.553726 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9xgfk" podStartSLOduration=4.240135016 podStartE2EDuration="21.553709507s" podCreationTimestamp="2026-04-20 21:13:10 +0000 UTC" firstStartedPulling="2026-04-20 21:13:12.866099798 +0000 UTC m=+3.109058901" lastFinishedPulling="2026-04-20 21:13:30.17967429 +0000 UTC m=+20.422633392" observedRunningTime="2026-04-20 
21:13:31.553062077 +0000 UTC m=+21.796021197" watchObservedRunningTime="2026-04-20 21:13:31.553709507 +0000 UTC m=+21.796668627" Apr 20 21:13:32.267159 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:32.267004 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T21:13:31.394375525Z","UUID":"031a6146-c2e5-4808-a40f-989ef530b830","Handler":null,"Name":"","Endpoint":""} Apr 20 21:13:32.269828 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:32.269805 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 21:13:32.269976 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:32.269836 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 21:13:32.337062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:32.337030 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:32.337062 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:32.337053 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:32.337300 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:32.337200 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a" Apr 20 21:13:32.337360 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:32.337330 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964" Apr 20 21:13:32.540464 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:32.540432 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4dhhn" event={"ID":"1931e219-0173-47c5-a78a-7401b997716c","Type":"ContainerStarted","Data":"5a061d6b2cd349b7e432cc614b886f1d40a9c68d9e979975b6e24ef653aea85a"} Apr 20 21:13:32.555961 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:32.555911 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4dhhn" podStartSLOduration=5.271790988 podStartE2EDuration="22.555895842s" podCreationTimestamp="2026-04-20 21:13:10 +0000 UTC" firstStartedPulling="2026-04-20 21:13:12.873702913 +0000 UTC m=+3.116662016" lastFinishedPulling="2026-04-20 21:13:30.157807761 +0000 UTC m=+20.400766870" observedRunningTime="2026-04-20 21:13:32.555425636 +0000 UTC m=+22.798384746" watchObservedRunningTime="2026-04-20 21:13:32.555895842 +0000 UTC m=+22.798854961" Apr 20 21:13:33.544992 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:33.544963 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log" Apr 20 21:13:33.545587 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:33.545480 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" event={"ID":"82c75868-1659-4814-b726-ba733f5f2ebc","Type":"ContainerStarted","Data":"0107b806ad551af3bdbf944d6fdf3ebbf98d1045879e1b593d62854e73b9aafd"} Apr 20 21:13:33.547474 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:33.547419 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" event={"ID":"061de7ab-84ea-4a69-b0d8-2c8f251826a8","Type":"ContainerStarted","Data":"63c061d38fbff0b62d4ed805c3ebd36cdd4dbe77e18974cae63f5bbb1407793c"} Apr 20 21:13:33.564196 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:33.564145 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd9tq" podStartSLOduration=3.9175807049999998 podStartE2EDuration="23.564128713s" podCreationTimestamp="2026-04-20 21:13:10 +0000 UTC" firstStartedPulling="2026-04-20 21:13:12.870794292 +0000 UTC m=+3.113753390" lastFinishedPulling="2026-04-20 21:13:32.517342287 +0000 UTC m=+22.760301398" observedRunningTime="2026-04-20 21:13:33.563153587 +0000 UTC m=+23.806112738" watchObservedRunningTime="2026-04-20 21:13:33.564128713 +0000 UTC m=+23.807087897" Apr 20 21:13:33.840782 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:33.840693 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nztmh" Apr 20 21:13:33.841402 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:33.841379 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nztmh" Apr 20 21:13:34.336834 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:34.336797 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:34.336994 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:34.336915 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a" Apr 20 21:13:34.336994 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:34.336976 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:34.337123 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:34.337102 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964" Apr 20 21:13:35.206696 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:35.206662 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nztmh" Apr 20 21:13:35.208248 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:35.208224 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nztmh" Apr 20 21:13:36.336930 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:36.336725 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:36.337439 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:36.336730 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:36.337439 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:36.337018 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964" Apr 20 21:13:36.337439 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:36.337105 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a" Apr 20 21:13:36.554772 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:36.554743 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log" Apr 20 21:13:36.555161 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:36.555138 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" event={"ID":"82c75868-1659-4814-b726-ba733f5f2ebc","Type":"ContainerStarted","Data":"2f8d85840e736a9f32939ed6faa0527e7918460af64356b335b171ea51e6e28f"} Apr 20 21:13:36.555453 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:36.555434 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:36.555558 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:36.555463 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:36.555645 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:36.555628 2571 scope.go:117] "RemoveContainer" containerID="e18771a971ec7e53ea02a8aee736258f7a1c2aea571f282c784f43b347281c4d" Apr 20 21:13:36.556891 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:36.556874 2571 generic.go:358] "Generic (PLEG): container finished" podID="6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9" containerID="c1f983653eca9f69b7c7928d1bc6b0d6df6979af0b800d11b30f9d3b907cfb3b" exitCode=0 Apr 20 21:13:36.556993 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:36.556967 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rhd4c" event={"ID":"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9","Type":"ContainerDied","Data":"c1f983653eca9f69b7c7928d1bc6b0d6df6979af0b800d11b30f9d3b907cfb3b"} Apr 20 21:13:36.572711 ip-10-0-129-149 kubenswrapper[2571]: 
I0420 21:13:36.572694 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:37.497427 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:37.497249 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cxz7h"] Apr 20 21:13:37.497974 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:37.497536 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:37.497974 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:37.497648 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a" Apr 20 21:13:37.501961 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:37.501936 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fk9cw"] Apr 20 21:13:37.502053 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:37.502040 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:37.502131 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:37.502113 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964" Apr 20 21:13:37.562409 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:37.562335 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log" Apr 20 21:13:37.562718 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:37.562696 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" event={"ID":"82c75868-1659-4814-b726-ba733f5f2ebc","Type":"ContainerStarted","Data":"13f462f5000c42b4e9336840ac04eee43772b255182dc9b5c88dcd23c7cc24f4"} Apr 20 21:13:37.562936 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:37.562912 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:37.564828 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:37.564801 2571 generic.go:358] "Generic (PLEG): container finished" podID="6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9" containerID="212a8351187920fbeb30b012f1a2851a09da98316ba1eb32377e1059c092bdfd" exitCode=0 Apr 20 21:13:37.564931 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:37.564847 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rhd4c" event={"ID":"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9","Type":"ContainerDied","Data":"212a8351187920fbeb30b012f1a2851a09da98316ba1eb32377e1059c092bdfd"} Apr 20 21:13:37.577869 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:37.577851 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:13:37.588345 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:37.588305 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" podStartSLOduration=10.238646796 podStartE2EDuration="27.588290349s" 
podCreationTimestamp="2026-04-20 21:13:10 +0000 UTC" firstStartedPulling="2026-04-20 21:13:12.864590619 +0000 UTC m=+3.107549718" lastFinishedPulling="2026-04-20 21:13:30.214234161 +0000 UTC m=+20.457193271" observedRunningTime="2026-04-20 21:13:37.587281364 +0000 UTC m=+27.830240483" watchObservedRunningTime="2026-04-20 21:13:37.588290349 +0000 UTC m=+27.831249469" Apr 20 21:13:38.572193 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:38.572128 2571 generic.go:358] "Generic (PLEG): container finished" podID="6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9" containerID="e8b5b8155c29292de7179505cbeec8b7acd0e065a1c91fc3379f715f4b557148" exitCode=0 Apr 20 21:13:38.572883 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:38.572211 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rhd4c" event={"ID":"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9","Type":"ContainerDied","Data":"e8b5b8155c29292de7179505cbeec8b7acd0e065a1c91fc3379f715f4b557148"} Apr 20 21:13:39.336905 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:39.336872 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:39.337067 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:39.336872 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:39.337067 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:39.337018 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a" Apr 20 21:13:39.337147 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:39.337103 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964" Apr 20 21:13:41.337468 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:41.337432 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:41.338046 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:41.337483 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:41.338046 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:41.337560 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cxz7h" podUID="369b8c8d-720a-4d32-a69a-64bd50a8103a" Apr 20 21:13:41.338046 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:41.337674 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fk9cw" podUID="89e3c54c-a866-4c9b-940d-54a417b5c964" Apr 20 21:13:43.123887 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.123851 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-149.ec2.internal" event="NodeReady" Apr 20 21:13:43.124353 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.124013 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 21:13:43.167926 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.167844 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cqwmd"] Apr 20 21:13:43.170336 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.170291 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cqwmd" Apr 20 21:13:43.170864 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.170839 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ld4fl"] Apr 20 21:13:43.172602 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.172581 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vtslr\"" Apr 20 21:13:43.172700 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.172585 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 21:13:43.172700 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.172617 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ld4fl" Apr 20 21:13:43.172700 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.172582 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 21:13:43.174826 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.174805 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kkzr9\"" Apr 20 21:13:43.174985 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.174968 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 21:13:43.175058 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.175042 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 21:13:43.175284 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.175269 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 21:13:43.180335 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.180317 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cqwmd"] Apr 20 21:13:43.184028 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.184004 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ld4fl"] Apr 20 21:13:43.243256 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.243049 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/096e06e8-bf34-462d-9f43-fd87848fd09e-tmp-dir\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd" Apr 20 21:13:43.243429 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.243283 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/096e06e8-bf34-462d-9f43-fd87848fd09e-config-volume\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd" Apr 20 21:13:43.243429 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.243309 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd" Apr 20 21:13:43.243429 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.243356 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm4xp\" (UniqueName: \"kubernetes.io/projected/08619f25-e76b-45d3-ab4b-8e9490d505f9-kube-api-access-pm4xp\") pod \"ingress-canary-ld4fl\" (UID: \"08619f25-e76b-45d3-ab4b-8e9490d505f9\") " pod="openshift-ingress-canary/ingress-canary-ld4fl" Apr 20 21:13:43.243596 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.243474 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r26xc\" (UniqueName: \"kubernetes.io/projected/096e06e8-bf34-462d-9f43-fd87848fd09e-kube-api-access-r26xc\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd" Apr 20 21:13:43.243596 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.243499 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert\") pod \"ingress-canary-ld4fl\" (UID: \"08619f25-e76b-45d3-ab4b-8e9490d505f9\") " pod="openshift-ingress-canary/ingress-canary-ld4fl" Apr 20 
21:13:43.336656 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.336618 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h" Apr 20 21:13:43.336841 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.336618 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:13:43.338905 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.338879 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 21:13:43.339032 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.338883 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 21:13:43.339229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.339209 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lxd5q\"" Apr 20 21:13:43.339430 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.339414 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 21:13:43.339854 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.339674 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fs7p8\"" Apr 20 21:13:43.344655 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.344629 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r26xc\" (UniqueName: \"kubernetes.io/projected/096e06e8-bf34-462d-9f43-fd87848fd09e-kube-api-access-r26xc\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd" Apr 20 21:13:43.344792 ip-10-0-129-149 kubenswrapper[2571]: I0420 
21:13:43.344667 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert\") pod \"ingress-canary-ld4fl\" (UID: \"08619f25-e76b-45d3-ab4b-8e9490d505f9\") " pod="openshift-ingress-canary/ingress-canary-ld4fl" Apr 20 21:13:43.344792 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.344709 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/096e06e8-bf34-462d-9f43-fd87848fd09e-tmp-dir\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd" Apr 20 21:13:43.344792 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.344746 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/096e06e8-bf34-462d-9f43-fd87848fd09e-config-volume\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd" Apr 20 21:13:43.344792 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.344769 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd" Apr 20 21:13:43.344993 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.344812 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pm4xp\" (UniqueName: \"kubernetes.io/projected/08619f25-e76b-45d3-ab4b-8e9490d505f9-kube-api-access-pm4xp\") pod \"ingress-canary-ld4fl\" (UID: \"08619f25-e76b-45d3-ab4b-8e9490d505f9\") " pod="openshift-ingress-canary/ingress-canary-ld4fl" Apr 20 21:13:43.345436 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.345407 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/096e06e8-bf34-462d-9f43-fd87848fd09e-tmp-dir\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd"
Apr 20 21:13:43.345688 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:43.345673 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:43.345749 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:43.345729 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert podName:08619f25-e76b-45d3-ab4b-8e9490d505f9 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:43.845712628 +0000 UTC m=+34.088671726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert") pod "ingress-canary-ld4fl" (UID: "08619f25-e76b-45d3-ab4b-8e9490d505f9") : secret "canary-serving-cert" not found
Apr 20 21:13:43.345806 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.345755 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/096e06e8-bf34-462d-9f43-fd87848fd09e-config-volume\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd"
Apr 20 21:13:43.345882 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:43.345862 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:43.345993 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:43.345926 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls podName:096e06e8-bf34-462d-9f43-fd87848fd09e nodeName:}" failed. No retries permitted until 2026-04-20 21:13:43.845908901 +0000 UTC m=+34.088868012 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls") pod "dns-default-cqwmd" (UID: "096e06e8-bf34-462d-9f43-fd87848fd09e") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:43.355250 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.355223 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r26xc\" (UniqueName: \"kubernetes.io/projected/096e06e8-bf34-462d-9f43-fd87848fd09e-kube-api-access-r26xc\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd"
Apr 20 21:13:43.355381 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.355348 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm4xp\" (UniqueName: \"kubernetes.io/projected/08619f25-e76b-45d3-ab4b-8e9490d505f9-kube-api-access-pm4xp\") pod \"ingress-canary-ld4fl\" (UID: \"08619f25-e76b-45d3-ab4b-8e9490d505f9\") " pod="openshift-ingress-canary/ingress-canary-ld4fl"
Apr 20 21:13:43.849580 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.849537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert\") pod \"ingress-canary-ld4fl\" (UID: \"08619f25-e76b-45d3-ab4b-8e9490d505f9\") " pod="openshift-ingress-canary/ingress-canary-ld4fl"
Apr 20 21:13:43.849769 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.849604 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd"
Apr 20 21:13:43.849769 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:43.849717 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:43.849769 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:43.849750 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:43.849900 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:43.849803 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert podName:08619f25-e76b-45d3-ab4b-8e9490d505f9 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:44.849777787 +0000 UTC m=+35.092736891 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert") pod "ingress-canary-ld4fl" (UID: "08619f25-e76b-45d3-ab4b-8e9490d505f9") : secret "canary-serving-cert" not found
Apr 20 21:13:43.849900 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:43.849824 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls podName:096e06e8-bf34-462d-9f43-fd87848fd09e nodeName:}" failed. No retries permitted until 2026-04-20 21:13:44.849813732 +0000 UTC m=+35.092772831 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls") pod "dns-default-cqwmd" (UID: "096e06e8-bf34-462d-9f43-fd87848fd09e") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:43.953593 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:43.951191 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs\") pod \"network-metrics-daemon-fk9cw\" (UID: \"89e3c54c-a866-4c9b-940d-54a417b5c964\") " pod="openshift-multus/network-metrics-daemon-fk9cw"
Apr 20 21:13:43.953593 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:43.951336 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 21:13:43.953593 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:43.951394 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs podName:89e3c54c-a866-4c9b-940d-54a417b5c964 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:15.951379561 +0000 UTC m=+66.194338659 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs") pod "network-metrics-daemon-fk9cw" (UID: "89e3c54c-a866-4c9b-940d-54a417b5c964") : secret "metrics-daemon-secret" not found
Apr 20 21:13:44.051971 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:44.051930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stvbx\" (UniqueName: \"kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx\") pod \"network-check-target-cxz7h\" (UID: \"369b8c8d-720a-4d32-a69a-64bd50a8103a\") " pod="openshift-network-diagnostics/network-check-target-cxz7h"
Apr 20 21:13:44.055767 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:44.055735 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stvbx\" (UniqueName: \"kubernetes.io/projected/369b8c8d-720a-4d32-a69a-64bd50a8103a-kube-api-access-stvbx\") pod \"network-check-target-cxz7h\" (UID: \"369b8c8d-720a-4d32-a69a-64bd50a8103a\") " pod="openshift-network-diagnostics/network-check-target-cxz7h"
Apr 20 21:13:44.248472 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:44.248437 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cxz7h"
Apr 20 21:13:44.379349 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:44.379316 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cxz7h"]
Apr 20 21:13:44.435968 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:13:44.435931 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod369b8c8d_720a_4d32_a69a_64bd50a8103a.slice/crio-d1c6f6f6ef44d4dcf177decb1f411c2d8ab2b52fad0a9238f2e6a2af26e91a2d WatchSource:0}: Error finding container d1c6f6f6ef44d4dcf177decb1f411c2d8ab2b52fad0a9238f2e6a2af26e91a2d: Status 404 returned error can't find the container with id d1c6f6f6ef44d4dcf177decb1f411c2d8ab2b52fad0a9238f2e6a2af26e91a2d
Apr 20 21:13:44.584704 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:44.584670 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cxz7h" event={"ID":"369b8c8d-720a-4d32-a69a-64bd50a8103a","Type":"ContainerStarted","Data":"d1c6f6f6ef44d4dcf177decb1f411c2d8ab2b52fad0a9238f2e6a2af26e91a2d"}
Apr 20 21:13:44.857856 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:44.857763 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert\") pod \"ingress-canary-ld4fl\" (UID: \"08619f25-e76b-45d3-ab4b-8e9490d505f9\") " pod="openshift-ingress-canary/ingress-canary-ld4fl"
Apr 20 21:13:44.857856 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:44.857805 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd"
Apr 20 21:13:44.858080 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:44.857922 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:44.858080 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:44.857926 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:44.858080 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:44.857984 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls podName:096e06e8-bf34-462d-9f43-fd87848fd09e nodeName:}" failed. No retries permitted until 2026-04-20 21:13:46.857970073 +0000 UTC m=+37.100929176 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls") pod "dns-default-cqwmd" (UID: "096e06e8-bf34-462d-9f43-fd87848fd09e") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:44.858080 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:44.857999 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert podName:08619f25-e76b-45d3-ab4b-8e9490d505f9 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:46.857993343 +0000 UTC m=+37.100952440 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert") pod "ingress-canary-ld4fl" (UID: "08619f25-e76b-45d3-ab4b-8e9490d505f9") : secret "canary-serving-cert" not found
Apr 20 21:13:45.590004 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:45.589969 2571 generic.go:358] "Generic (PLEG): container finished" podID="6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9" containerID="5dacbded419855f73a4c388bfd1e49fc028d214e69beae4641c67dd4eb0a0a7b" exitCode=0
Apr 20 21:13:45.590004 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:45.590016 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rhd4c" event={"ID":"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9","Type":"ContainerDied","Data":"5dacbded419855f73a4c388bfd1e49fc028d214e69beae4641c67dd4eb0a0a7b"}
Apr 20 21:13:46.596027 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:46.595839 2571 generic.go:358] "Generic (PLEG): container finished" podID="6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9" containerID="fa7ad904b552c4f73197ad5f2359f01ab121d1afe3f6df0474f23162bf9d77b2" exitCode=0
Apr 20 21:13:46.596500 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:46.595918 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rhd4c" event={"ID":"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9","Type":"ContainerDied","Data":"fa7ad904b552c4f73197ad5f2359f01ab121d1afe3f6df0474f23162bf9d77b2"}
Apr 20 21:13:46.875815 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:46.875734 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert\") pod \"ingress-canary-ld4fl\" (UID: \"08619f25-e76b-45d3-ab4b-8e9490d505f9\") " pod="openshift-ingress-canary/ingress-canary-ld4fl"
Apr 20 21:13:46.875815 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:46.875780 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd"
Apr 20 21:13:46.876040 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:46.875953 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:46.876040 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:46.875985 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:46.876040 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:46.876037 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert podName:08619f25-e76b-45d3-ab4b-8e9490d505f9 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:50.876015853 +0000 UTC m=+41.118974952 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert") pod "ingress-canary-ld4fl" (UID: "08619f25-e76b-45d3-ab4b-8e9490d505f9") : secret "canary-serving-cert" not found
Apr 20 21:13:46.876156 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:46.876056 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls podName:096e06e8-bf34-462d-9f43-fd87848fd09e nodeName:}" failed. No retries permitted until 2026-04-20 21:13:50.876046622 +0000 UTC m=+41.119005727 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls") pod "dns-default-cqwmd" (UID: "096e06e8-bf34-462d-9f43-fd87848fd09e") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:47.598806 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:47.598772 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cxz7h" event={"ID":"369b8c8d-720a-4d32-a69a-64bd50a8103a","Type":"ContainerStarted","Data":"aa5b5c0b68bac1e0b7adda972a79246e533c4d7eb064fbfd2b1b1ed1b84e5940"}
Apr 20 21:13:47.599389 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:47.598874 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-cxz7h"
Apr 20 21:13:47.601712 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:47.601689 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rhd4c" event={"ID":"6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9","Type":"ContainerStarted","Data":"6acf790af5ec63dbe006d743517b23136d598dfecd09c18a9b4a5bee0b08bf4a"}
Apr 20 21:13:47.614274 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:47.614222 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-cxz7h" podStartSLOduration=34.725321013 podStartE2EDuration="37.614205964s" podCreationTimestamp="2026-04-20 21:13:10 +0000 UTC" firstStartedPulling="2026-04-20 21:13:44.448420981 +0000 UTC m=+34.691380079" lastFinishedPulling="2026-04-20 21:13:47.337305932 +0000 UTC m=+37.580265030" observedRunningTime="2026-04-20 21:13:47.613106876 +0000 UTC m=+37.856065996" watchObservedRunningTime="2026-04-20 21:13:47.614205964 +0000 UTC m=+37.857165085"
Apr 20 21:13:47.635063 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:47.635025 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rhd4c" podStartSLOduration=6.025397486 podStartE2EDuration="37.635014731s" podCreationTimestamp="2026-04-20 21:13:10 +0000 UTC" firstStartedPulling="2026-04-20 21:13:12.861467374 +0000 UTC m=+3.104426471" lastFinishedPulling="2026-04-20 21:13:44.471084604 +0000 UTC m=+34.714043716" observedRunningTime="2026-04-20 21:13:47.634414574 +0000 UTC m=+37.877373695" watchObservedRunningTime="2026-04-20 21:13:47.635014731 +0000 UTC m=+37.877973851"
Apr 20 21:13:50.904943 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:50.904914 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert\") pod \"ingress-canary-ld4fl\" (UID: \"08619f25-e76b-45d3-ab4b-8e9490d505f9\") " pod="openshift-ingress-canary/ingress-canary-ld4fl"
Apr 20 21:13:50.905465 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:50.904957 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd"
Apr 20 21:13:50.905465 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:50.905057 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:50.905465 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:50.905064 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:50.905465 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:50.905115 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert podName:08619f25-e76b-45d3-ab4b-8e9490d505f9 nodeName:}" failed. No retries permitted until 2026-04-20 21:13:58.905101974 +0000 UTC m=+49.148061073 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert") pod "ingress-canary-ld4fl" (UID: "08619f25-e76b-45d3-ab4b-8e9490d505f9") : secret "canary-serving-cert" not found
Apr 20 21:13:50.905465 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:50.905129 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls podName:096e06e8-bf34-462d-9f43-fd87848fd09e nodeName:}" failed. No retries permitted until 2026-04-20 21:13:58.905122697 +0000 UTC m=+49.148081795 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls") pod "dns-default-cqwmd" (UID: "096e06e8-bf34-462d-9f43-fd87848fd09e") : secret "dns-default-metrics-tls" not found
Apr 20 21:13:58.961429 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:58.961391 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd"
Apr 20 21:13:58.961896 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:13:58.961451 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert\") pod \"ingress-canary-ld4fl\" (UID: \"08619f25-e76b-45d3-ab4b-8e9490d505f9\") " pod="openshift-ingress-canary/ingress-canary-ld4fl"
Apr 20 21:13:58.961896 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:58.961527 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:13:58.961896 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:58.961529 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:13:58.961896 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:58.961586 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert podName:08619f25-e76b-45d3-ab4b-8e9490d505f9 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:14.961572551 +0000 UTC m=+65.204531653 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert") pod "ingress-canary-ld4fl" (UID: "08619f25-e76b-45d3-ab4b-8e9490d505f9") : secret "canary-serving-cert" not found
Apr 20 21:13:58.961896 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:13:58.961599 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls podName:096e06e8-bf34-462d-9f43-fd87848fd09e nodeName:}" failed. No retries permitted until 2026-04-20 21:14:14.961593109 +0000 UTC m=+65.204552206 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls") pod "dns-default-cqwmd" (UID: "096e06e8-bf34-462d-9f43-fd87848fd09e") : secret "dns-default-metrics-tls" not found
Apr 20 21:14:07.876573 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.876527 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7"]
Apr 20 21:14:07.883681 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.883655 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz"]
Apr 20 21:14:07.883825 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.883769 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7"
Apr 20 21:14:07.885972 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.885950 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 20 21:14:07.886110 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.886004 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 20 21:14:07.886110 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.886079 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 20 21:14:07.886110 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.886093 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-xwsvd\""
Apr 20 21:14:07.886711 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.886677 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz"]
Apr 20 21:14:07.886711 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.886702 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7"]
Apr 20 21:14:07.886826 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.886794 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz"
Apr 20 21:14:07.888558 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.888541 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-q2s6b\""
Apr 20 21:14:07.888649 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.888545 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 20 21:14:07.888793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.888775 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 20 21:14:07.888867 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.888801 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 20 21:14:07.888867 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.888832 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 20 21:14:07.973220 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.973173 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-s4nps"]
Apr 20 21:14:07.976102 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.976083 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-hjvgw"]
Apr 20 21:14:07.976248 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.976233 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-s4nps"
Apr 20 21:14:07.978220 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.978202 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 20 21:14:07.978322 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.978209 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-8m4gm\""
Apr 20 21:14:07.978322 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.978240 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 20 21:14:07.978891 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.978875 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw"
Apr 20 21:14:07.980652 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.980634 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 20 21:14:07.980742 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.980640 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-wflrc\""
Apr 20 21:14:07.980865 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.980850 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 20 21:14:07.980925 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.980856 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 20 21:14:07.981000 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.980983 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 20 21:14:07.983096 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.983074 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xxkhk"]
Apr 20 21:14:07.986134 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.986109 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-79b56464d-ztxdm"]
Apr 20 21:14:07.986254 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.986232 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-xxkhk"
Apr 20 21:14:07.987483 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.987464 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 20 21:14:07.988222 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.988205 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 20 21:14:07.988387 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.988209 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-hdrlt\""
Apr 20 21:14:07.989242 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.989221 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 21:14:07.989440 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.989423 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-s4nps"]
Apr 20 21:14:07.989656 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.989520 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79b56464d-ztxdm"
Apr 20 21:14:07.989954 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.989935 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 20 21:14:07.990320 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.990299 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-hjvgw"]
Apr 20 21:14:07.990619 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.990594 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 21:14:07.991423 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.991403 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 21:14:07.992591 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.992572 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-82gsd\""
Apr 20 21:14:07.992675 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.992572 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 20 21:14:07.992975 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.992954 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 21:14:07.992975 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.992968 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 20 21:14:07.993385 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.993355 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 20 21:14:07.993474 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.993411 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 20 21:14:07.994401 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.994379 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 20 21:14:08.000105 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:07.998266 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xxkhk"]
Apr 20 21:14:08.001249 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.001229 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-79b56464d-ztxdm"]
Apr 20 21:14:08.021196 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.021152 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-mvjf7\" (UID: \"86ba4aaf-0c45-4967-a1d3-51755a3cd672\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7"
Apr 20 21:14:08.021285 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.021222 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpbz5\" (UniqueName: \"kubernetes.io/projected/86ba4aaf-0c45-4967-a1d3-51755a3cd672-kube-api-access-kpbz5\") pod \"cluster-samples-operator-6dc5bdb6b4-mvjf7\" (UID: \"86ba4aaf-0c45-4967-a1d3-51755a3cd672\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7"
Apr 20 21:14:08.021285 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.021264 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36c761e2-ffbf-4d69-8d19-9b3793a3acf9-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-8gwlz\" (UID: \"36c761e2-ffbf-4d69-8d19-9b3793a3acf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz"
Apr 20 21:14:08.021399 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.021287 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36c761e2-ffbf-4d69-8d19-9b3793a3acf9-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-8gwlz\" (UID: \"36c761e2-ffbf-4d69-8d19-9b3793a3acf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz"
Apr 20 21:14:08.021399 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.021359 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d47g\" (UniqueName: \"kubernetes.io/projected/36c761e2-ffbf-4d69-8d19-9b3793a3acf9-kube-api-access-8d47g\") pod \"kube-storage-version-migrator-operator-6769c5d45-8gwlz\" (UID: \"36c761e2-ffbf-4d69-8d19-9b3793a3acf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz"
Apr 20 21:14:08.122257 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122229 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpbz5\" (UniqueName: \"kubernetes.io/projected/86ba4aaf-0c45-4967-a1d3-51755a3cd672-kube-api-access-kpbz5\") pod \"cluster-samples-operator-6dc5bdb6b4-mvjf7\" (UID: \"86ba4aaf-0c45-4967-a1d3-51755a3cd672\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7"
Apr 20 21:14:08.122393 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122283 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/838a3bd4-1a50-4127-a629-525bfede6ffd-config\") pod \"console-operator-9d4b6777b-hjvgw\" (UID: \"838a3bd4-1a50-4127-a629-525bfede6ffd\") " pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw"
Apr 20 21:14:08.122393 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122299 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsqzg\" (UniqueName: \"kubernetes.io/projected/299bba46-6418-4baf-8a89-6db7597a7bc4-kube-api-access-fsqzg\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm"
Apr 20 21:14:08.122393 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122317 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36c761e2-ffbf-4d69-8d19-9b3793a3acf9-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-8gwlz\" (UID: \"36c761e2-ffbf-4d69-8d19-9b3793a3acf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz"
Apr 20 21:14:08.122393 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122332 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/838a3bd4-1a50-4127-a629-525bfede6ffd-serving-cert\") pod \"console-operator-9d4b6777b-hjvgw\" (UID: \"838a3bd4-1a50-4127-a629-525bfede6ffd\") " pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw"
Apr 20 21:14:08.122393 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122347 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm4xx\" (UniqueName: \"kubernetes.io/projected/838a3bd4-1a50-4127-a629-525bfede6ffd-kube-api-access-dm4xx\") pod \"console-operator-9d4b6777b-hjvgw\" (UID: \"838a3bd4-1a50-4127-a629-525bfede6ffd\") " pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw"
Apr 20 21:14:08.122393 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122380 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36c761e2-ffbf-4d69-8d19-9b3793a3acf9-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-8gwlz\" (UID: \"36c761e2-ffbf-4d69-8d19-9b3793a3acf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz"
Apr 20 21:14:08.122682 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122419 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6a03f80-2426-4087-b868-a71402310e22-service-ca-bundle\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk"
Apr 20 21:14:08.122682 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122463 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8d47g\" (UniqueName: \"kubernetes.io/projected/36c761e2-ffbf-4d69-8d19-9b3793a3acf9-kube-api-access-8d47g\") pod \"kube-storage-version-migrator-operator-6769c5d45-8gwlz\" (UID: \"36c761e2-ffbf-4d69-8d19-9b3793a3acf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz"
Apr 20 21:14:08.122682 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122492 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/838a3bd4-1a50-4127-a629-525bfede6ffd-trusted-ca\") pod \"console-operator-9d4b6777b-hjvgw\" (UID: \"838a3bd4-1a50-4127-a629-525bfede6ffd\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" Apr 20 21:14:08.122682 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122514 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:08.122682 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122627 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-stats-auth\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:08.122887 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-mvjf7\" (UID: \"86ba4aaf-0c45-4967-a1d3-51755a3cd672\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7" Apr 20 21:14:08.122887 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122728 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-default-certificate\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:08.122887 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122751 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6a03f80-2426-4087-b868-a71402310e22-tmp\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.122887 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122776 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f6a03f80-2426-4087-b868-a71402310e22-snapshots\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.122887 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:08.122787 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 21:14:08.122887 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:08.122852 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls podName:86ba4aaf-0c45-4967-a1d3-51755a3cd672 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:08.622837165 +0000 UTC m=+58.865796263 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-mvjf7" (UID: "86ba4aaf-0c45-4967-a1d3-51755a3cd672") : secret "samples-operator-tls" not found Apr 20 21:14:08.122887 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122793 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:08.123172 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122892 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6a03f80-2426-4087-b868-a71402310e22-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.123172 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122894 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36c761e2-ffbf-4d69-8d19-9b3793a3acf9-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-8gwlz\" (UID: \"36c761e2-ffbf-4d69-8d19-9b3793a3acf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz" Apr 20 21:14:08.123172 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122910 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzl6p\" (UniqueName: \"kubernetes.io/projected/f6a03f80-2426-4087-b868-a71402310e22-kube-api-access-lzl6p\") pod 
\"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.123172 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122929 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bps8\" (UniqueName: \"kubernetes.io/projected/6500e95b-38f5-4c5c-b0f3-f38cf82ffcb6-kube-api-access-2bps8\") pod \"volume-data-source-validator-7c6cbb6c87-s4nps\" (UID: \"6500e95b-38f5-4c5c-b0f3-f38cf82ffcb6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-s4nps" Apr 20 21:14:08.123172 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.122959 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6a03f80-2426-4087-b868-a71402310e22-serving-cert\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.126153 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.126129 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36c761e2-ffbf-4d69-8d19-9b3793a3acf9-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-8gwlz\" (UID: \"36c761e2-ffbf-4d69-8d19-9b3793a3acf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz" Apr 20 21:14:08.134060 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.134013 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d47g\" (UniqueName: \"kubernetes.io/projected/36c761e2-ffbf-4d69-8d19-9b3793a3acf9-kube-api-access-8d47g\") pod \"kube-storage-version-migrator-operator-6769c5d45-8gwlz\" (UID: \"36c761e2-ffbf-4d69-8d19-9b3793a3acf9\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz" Apr 20 21:14:08.134470 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.134453 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpbz5\" (UniqueName: \"kubernetes.io/projected/86ba4aaf-0c45-4967-a1d3-51755a3cd672-kube-api-access-kpbz5\") pod \"cluster-samples-operator-6dc5bdb6b4-mvjf7\" (UID: \"86ba4aaf-0c45-4967-a1d3-51755a3cd672\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7" Apr 20 21:14:08.201494 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.201470 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz" Apr 20 21:14:08.224356 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.224328 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/838a3bd4-1a50-4127-a629-525bfede6ffd-config\") pod \"console-operator-9d4b6777b-hjvgw\" (UID: \"838a3bd4-1a50-4127-a629-525bfede6ffd\") " pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" Apr 20 21:14:08.224448 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.224372 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsqzg\" (UniqueName: \"kubernetes.io/projected/299bba46-6418-4baf-8a89-6db7597a7bc4-kube-api-access-fsqzg\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:08.224448 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.224401 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/838a3bd4-1a50-4127-a629-525bfede6ffd-serving-cert\") pod 
\"console-operator-9d4b6777b-hjvgw\" (UID: \"838a3bd4-1a50-4127-a629-525bfede6ffd\") " pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" Apr 20 21:14:08.224448 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.224426 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dm4xx\" (UniqueName: \"kubernetes.io/projected/838a3bd4-1a50-4127-a629-525bfede6ffd-kube-api-access-dm4xx\") pod \"console-operator-9d4b6777b-hjvgw\" (UID: \"838a3bd4-1a50-4127-a629-525bfede6ffd\") " pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" Apr 20 21:14:08.224586 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.224458 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6a03f80-2426-4087-b868-a71402310e22-service-ca-bundle\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.224685 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.224662 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/838a3bd4-1a50-4127-a629-525bfede6ffd-trusted-ca\") pod \"console-operator-9d4b6777b-hjvgw\" (UID: \"838a3bd4-1a50-4127-a629-525bfede6ffd\") " pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" Apr 20 21:14:08.224761 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.224705 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:08.224761 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.224734 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-stats-auth\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:08.224857 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.224788 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-default-certificate\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:08.224857 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.224817 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6a03f80-2426-4087-b868-a71402310e22-tmp\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.224857 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.224845 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f6a03f80-2426-4087-b868-a71402310e22-snapshots\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.224994 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:08.224869 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle podName:299bba46-6418-4baf-8a89-6db7597a7bc4 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:08.724844066 +0000 UTC m=+58.967803189 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle") pod "router-default-79b56464d-ztxdm" (UID: "299bba46-6418-4baf-8a89-6db7597a7bc4") : configmap references non-existent config key: service-ca.crt Apr 20 21:14:08.224994 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.224916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:08.224994 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.224944 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6a03f80-2426-4087-b868-a71402310e22-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.224994 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.224968 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzl6p\" (UniqueName: \"kubernetes.io/projected/f6a03f80-2426-4087-b868-a71402310e22-kube-api-access-lzl6p\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.225160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.224994 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bps8\" (UniqueName: \"kubernetes.io/projected/6500e95b-38f5-4c5c-b0f3-f38cf82ffcb6-kube-api-access-2bps8\") pod \"volume-data-source-validator-7c6cbb6c87-s4nps\" (UID: \"6500e95b-38f5-4c5c-b0f3-f38cf82ffcb6\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-s4nps" Apr 20 21:14:08.225160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.225029 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6a03f80-2426-4087-b868-a71402310e22-serving-cert\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.225160 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.225070 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6a03f80-2426-4087-b868-a71402310e22-service-ca-bundle\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.225332 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.225277 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/838a3bd4-1a50-4127-a629-525bfede6ffd-config\") pod \"console-operator-9d4b6777b-hjvgw\" (UID: \"838a3bd4-1a50-4127-a629-525bfede6ffd\") " pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" Apr 20 21:14:08.225480 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.225456 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f6a03f80-2426-4087-b868-a71402310e22-snapshots\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.225585 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.225562 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/f6a03f80-2426-4087-b868-a71402310e22-tmp\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.225669 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:08.225652 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 21:14:08.225742 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:08.225730 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs podName:299bba46-6418-4baf-8a89-6db7597a7bc4 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:08.725713555 +0000 UTC m=+58.968672662 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs") pod "router-default-79b56464d-ztxdm" (UID: "299bba46-6418-4baf-8a89-6db7597a7bc4") : secret "router-metrics-certs-default" not found Apr 20 21:14:08.226152 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.226087 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/838a3bd4-1a50-4127-a629-525bfede6ffd-trusted-ca\") pod \"console-operator-9d4b6777b-hjvgw\" (UID: \"838a3bd4-1a50-4127-a629-525bfede6ffd\") " pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" Apr 20 21:14:08.226338 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.226320 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6a03f80-2426-4087-b868-a71402310e22-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.227556 ip-10-0-129-149 
kubenswrapper[2571]: I0420 21:14:08.227532 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-stats-auth\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:08.227857 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.227840 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/838a3bd4-1a50-4127-a629-525bfede6ffd-serving-cert\") pod \"console-operator-9d4b6777b-hjvgw\" (UID: \"838a3bd4-1a50-4127-a629-525bfede6ffd\") " pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" Apr 20 21:14:08.228139 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.228124 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-default-certificate\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:08.228308 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.228292 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6a03f80-2426-4087-b868-a71402310e22-serving-cert\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.232517 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.232497 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsqzg\" (UniqueName: \"kubernetes.io/projected/299bba46-6418-4baf-8a89-6db7597a7bc4-kube-api-access-fsqzg\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " 
pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:08.232607 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.232592 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm4xx\" (UniqueName: \"kubernetes.io/projected/838a3bd4-1a50-4127-a629-525bfede6ffd-kube-api-access-dm4xx\") pod \"console-operator-9d4b6777b-hjvgw\" (UID: \"838a3bd4-1a50-4127-a629-525bfede6ffd\") " pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" Apr 20 21:14:08.237689 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.237642 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bps8\" (UniqueName: \"kubernetes.io/projected/6500e95b-38f5-4c5c-b0f3-f38cf82ffcb6-kube-api-access-2bps8\") pod \"volume-data-source-validator-7c6cbb6c87-s4nps\" (UID: \"6500e95b-38f5-4c5c-b0f3-f38cf82ffcb6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-s4nps" Apr 20 21:14:08.237801 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.237782 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzl6p\" (UniqueName: \"kubernetes.io/projected/f6a03f80-2426-4087-b868-a71402310e22-kube-api-access-lzl6p\") pod \"insights-operator-585dfdc468-xxkhk\" (UID: \"f6a03f80-2426-4087-b868-a71402310e22\") " pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.286549 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.286522 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-s4nps" Apr 20 21:14:08.294107 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.294080 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" Apr 20 21:14:08.303851 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.303829 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-xxkhk" Apr 20 21:14:08.317950 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.317922 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz"] Apr 20 21:14:08.320887 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:08.320852 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36c761e2_ffbf_4d69_8d19_9b3793a3acf9.slice/crio-b48daa70e09d43160d94f5a29f4ac8a2f97393d26e6a7a1282d3f76d48dda02f WatchSource:0}: Error finding container b48daa70e09d43160d94f5a29f4ac8a2f97393d26e6a7a1282d3f76d48dda02f: Status 404 returned error can't find the container with id b48daa70e09d43160d94f5a29f4ac8a2f97393d26e6a7a1282d3f76d48dda02f Apr 20 21:14:08.434353 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.434326 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-s4nps"] Apr 20 21:14:08.438160 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:08.438132 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6500e95b_38f5_4c5c_b0f3_f38cf82ffcb6.slice/crio-b615050ebddd487efda88499e7602cbe42758a53f255aa289f52f330656fa7be WatchSource:0}: Error finding container b615050ebddd487efda88499e7602cbe42758a53f255aa289f52f330656fa7be: Status 404 returned error can't find the container with id b615050ebddd487efda88499e7602cbe42758a53f255aa289f52f330656fa7be Apr 20 21:14:08.468204 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.468157 2571 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-hjvgw"] Apr 20 21:14:08.468315 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:08.468265 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod838a3bd4_1a50_4127_a629_525bfede6ffd.slice/crio-58b6fce0e9ccf898b532f029eaf17e725741eda4a919305ee14abed2fa35dee9 WatchSource:0}: Error finding container 58b6fce0e9ccf898b532f029eaf17e725741eda4a919305ee14abed2fa35dee9: Status 404 returned error can't find the container with id 58b6fce0e9ccf898b532f029eaf17e725741eda4a919305ee14abed2fa35dee9 Apr 20 21:14:08.491602 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.491580 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xxkhk"] Apr 20 21:14:08.495415 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:08.495394 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6a03f80_2426_4087_b868_a71402310e22.slice/crio-379468ccae20f5a7c99e4947b0bf0b8cbccf7280d04d6d19b27682ea15f9ffb1 WatchSource:0}: Error finding container 379468ccae20f5a7c99e4947b0bf0b8cbccf7280d04d6d19b27682ea15f9ffb1: Status 404 returned error can't find the container with id 379468ccae20f5a7c99e4947b0bf0b8cbccf7280d04d6d19b27682ea15f9ffb1 Apr 20 21:14:08.628279 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.628248 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-mvjf7\" (UID: \"86ba4aaf-0c45-4967-a1d3-51755a3cd672\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7" Apr 20 21:14:08.628404 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:08.628365 2571 
secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 21:14:08.628457 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:08.628425 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls podName:86ba4aaf-0c45-4967-a1d3-51755a3cd672 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:09.628407316 +0000 UTC m=+59.871366421 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-mvjf7" (UID: "86ba4aaf-0c45-4967-a1d3-51755a3cd672") : secret "samples-operator-tls" not found Apr 20 21:14:08.640916 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.640885 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xxkhk" event={"ID":"f6a03f80-2426-4087-b868-a71402310e22","Type":"ContainerStarted","Data":"379468ccae20f5a7c99e4947b0bf0b8cbccf7280d04d6d19b27682ea15f9ffb1"} Apr 20 21:14:08.641854 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.641831 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz" event={"ID":"36c761e2-ffbf-4d69-8d19-9b3793a3acf9","Type":"ContainerStarted","Data":"b48daa70e09d43160d94f5a29f4ac8a2f97393d26e6a7a1282d3f76d48dda02f"} Apr 20 21:14:08.642684 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.642660 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" event={"ID":"838a3bd4-1a50-4127-a629-525bfede6ffd","Type":"ContainerStarted","Data":"58b6fce0e9ccf898b532f029eaf17e725741eda4a919305ee14abed2fa35dee9"} Apr 20 21:14:08.643479 ip-10-0-129-149 kubenswrapper[2571]: I0420 
21:14:08.643460 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-s4nps" event={"ID":"6500e95b-38f5-4c5c-b0f3-f38cf82ffcb6","Type":"ContainerStarted","Data":"b615050ebddd487efda88499e7602cbe42758a53f255aa289f52f330656fa7be"} Apr 20 21:14:08.729326 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.729260 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:08.729326 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:08.729311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:08.729454 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:08.729401 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 21:14:08.729454 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:08.729412 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle podName:299bba46-6418-4baf-8a89-6db7597a7bc4 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:09.729396078 +0000 UTC m=+59.972355176 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle") pod "router-default-79b56464d-ztxdm" (UID: "299bba46-6418-4baf-8a89-6db7597a7bc4") : configmap references non-existent config key: service-ca.crt Apr 20 21:14:08.729454 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:08.729437 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs podName:299bba46-6418-4baf-8a89-6db7597a7bc4 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:09.729426394 +0000 UTC m=+59.972385494 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs") pod "router-default-79b56464d-ztxdm" (UID: "299bba46-6418-4baf-8a89-6db7597a7bc4") : secret "router-metrics-certs-default" not found Apr 20 21:14:09.589710 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:09.589675 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q6skp" Apr 20 21:14:09.636387 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:09.636353 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-mvjf7\" (UID: \"86ba4aaf-0c45-4967-a1d3-51755a3cd672\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7" Apr 20 21:14:09.636541 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:09.636518 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 21:14:09.636601 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:09.636587 2571 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls podName:86ba4aaf-0c45-4967-a1d3-51755a3cd672 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:11.636566485 +0000 UTC m=+61.879525589 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-mvjf7" (UID: "86ba4aaf-0c45-4967-a1d3-51755a3cd672") : secret "samples-operator-tls" not found Apr 20 21:14:09.737651 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:09.737608 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:09.737815 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:09.737701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:09.737879 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:09.737828 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 21:14:09.737931 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:09.737892 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs podName:299bba46-6418-4baf-8a89-6db7597a7bc4 nodeName:}" failed. 
No retries permitted until 2026-04-20 21:14:11.737871907 +0000 UTC m=+61.980831020 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs") pod "router-default-79b56464d-ztxdm" (UID: "299bba46-6418-4baf-8a89-6db7597a7bc4") : secret "router-metrics-certs-default" not found Apr 20 21:14:09.738852 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:09.738829 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle podName:299bba46-6418-4baf-8a89-6db7597a7bc4 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:11.738812835 +0000 UTC m=+61.981771936 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle") pod "router-default-79b56464d-ztxdm" (UID: "299bba46-6418-4baf-8a89-6db7597a7bc4") : configmap references non-existent config key: service-ca.crt Apr 20 21:14:11.655956 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:11.655935 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-mvjf7\" (UID: \"86ba4aaf-0c45-4967-a1d3-51755a3cd672\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7" Apr 20 21:14:11.656269 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:11.656072 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 21:14:11.656269 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:11.656126 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls podName:86ba4aaf-0c45-4967-a1d3-51755a3cd672 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:15.656112753 +0000 UTC m=+65.899071854 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-mvjf7" (UID: "86ba4aaf-0c45-4967-a1d3-51755a3cd672") : secret "samples-operator-tls" not found Apr 20 21:14:11.757047 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:11.757004 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:11.757245 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:11.757091 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:11.757245 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:11.757157 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle podName:299bba46-6418-4baf-8a89-6db7597a7bc4 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:15.757138498 +0000 UTC m=+66.000097596 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle") pod "router-default-79b56464d-ztxdm" (UID: "299bba46-6418-4baf-8a89-6db7597a7bc4") : configmap references non-existent config key: service-ca.crt Apr 20 21:14:11.757370 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:11.757250 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 21:14:11.757370 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:11.757303 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs podName:299bba46-6418-4baf-8a89-6db7597a7bc4 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:15.757288059 +0000 UTC m=+66.000247174 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs") pod "router-default-79b56464d-ztxdm" (UID: "299bba46-6418-4baf-8a89-6db7597a7bc4") : secret "router-metrics-certs-default" not found Apr 20 21:14:12.654435 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.654401 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xxkhk" event={"ID":"f6a03f80-2426-4087-b868-a71402310e22","Type":"ContainerStarted","Data":"dbeb3a3f530a63aa7e5769487daa90bac2702c580990af8c88c02164bcc1651d"} Apr 20 21:14:12.655787 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.655759 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz" event={"ID":"36c761e2-ffbf-4d69-8d19-9b3793a3acf9","Type":"ContainerStarted","Data":"a172960b27c2f9176eb327dda1829bab5da7c3aeba4c3aec96a1504dc73506bf"} Apr 20 21:14:12.657243 ip-10-0-129-149 
kubenswrapper[2571]: I0420 21:14:12.657223 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/0.log" Apr 20 21:14:12.657587 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.657259 2571 generic.go:358] "Generic (PLEG): container finished" podID="838a3bd4-1a50-4127-a629-525bfede6ffd" containerID="e1f235dc1c18670bb08957f0f49a90cd2a286dffd6ebaa1f0695aa0e1feb7f69" exitCode=255 Apr 20 21:14:12.657587 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.657312 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" event={"ID":"838a3bd4-1a50-4127-a629-525bfede6ffd","Type":"ContainerDied","Data":"e1f235dc1c18670bb08957f0f49a90cd2a286dffd6ebaa1f0695aa0e1feb7f69"} Apr 20 21:14:12.657587 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.657515 2571 scope.go:117] "RemoveContainer" containerID="e1f235dc1c18670bb08957f0f49a90cd2a286dffd6ebaa1f0695aa0e1feb7f69" Apr 20 21:14:12.658689 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.658669 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-s4nps" event={"ID":"6500e95b-38f5-4c5c-b0f3-f38cf82ffcb6","Type":"ContainerStarted","Data":"ba2bec85b89bd597c59aa66549e0883630147a3f8dd79a3fe8f7f3bb74b7038a"} Apr 20 21:14:12.671003 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.670964 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-xxkhk" podStartSLOduration=2.4909632090000002 podStartE2EDuration="5.670953854s" podCreationTimestamp="2026-04-20 21:14:07 +0000 UTC" firstStartedPulling="2026-04-20 21:14:08.497056172 +0000 UTC m=+58.740015270" lastFinishedPulling="2026-04-20 21:14:11.677046802 +0000 UTC m=+61.920005915" observedRunningTime="2026-04-20 21:14:12.670032346 +0000 UTC 
m=+62.912991468" watchObservedRunningTime="2026-04-20 21:14:12.670953854 +0000 UTC m=+62.913912973" Apr 20 21:14:12.705517 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.705265 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz" podStartSLOduration=2.393558084 podStartE2EDuration="5.705247533s" podCreationTimestamp="2026-04-20 21:14:07 +0000 UTC" firstStartedPulling="2026-04-20 21:14:08.322859881 +0000 UTC m=+58.565818980" lastFinishedPulling="2026-04-20 21:14:11.634549317 +0000 UTC m=+61.877508429" observedRunningTime="2026-04-20 21:14:12.704654542 +0000 UTC m=+62.947613853" watchObservedRunningTime="2026-04-20 21:14:12.705247533 +0000 UTC m=+62.948206651" Apr 20 21:14:12.734432 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.734389 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-s4nps" podStartSLOduration=2.544267884 podStartE2EDuration="5.73437432s" podCreationTimestamp="2026-04-20 21:14:07 +0000 UTC" firstStartedPulling="2026-04-20 21:14:08.440302437 +0000 UTC m=+58.683261535" lastFinishedPulling="2026-04-20 21:14:11.630408858 +0000 UTC m=+61.873367971" observedRunningTime="2026-04-20 21:14:12.733736799 +0000 UTC m=+62.976695920" watchObservedRunningTime="2026-04-20 21:14:12.73437432 +0000 UTC m=+62.977333418" Apr 20 21:14:12.792702 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.792675 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg9x"] Apr 20 21:14:12.797618 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.797598 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg9x" Apr 20 21:14:12.799605 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.799579 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 21:14:12.799727 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.799634 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-244k7\"" Apr 20 21:14:12.799875 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.799860 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 21:14:12.804411 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.804389 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg9x"] Apr 20 21:14:12.865446 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.865426 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z8tz\" (UniqueName: \"kubernetes.io/projected/d157b838-4286-4cdb-9399-eea3bc5bb5fd-kube-api-access-2z8tz\") pod \"migrator-74bb7799d9-scg9x\" (UID: \"d157b838-4286-4cdb-9399-eea3bc5bb5fd\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg9x" Apr 20 21:14:12.966194 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:12.966121 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z8tz\" (UniqueName: \"kubernetes.io/projected/d157b838-4286-4cdb-9399-eea3bc5bb5fd-kube-api-access-2z8tz\") pod \"migrator-74bb7799d9-scg9x\" (UID: \"d157b838-4286-4cdb-9399-eea3bc5bb5fd\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg9x" Apr 20 21:14:12.974116 ip-10-0-129-149 
kubenswrapper[2571]: I0420 21:14:12.974086 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z8tz\" (UniqueName: \"kubernetes.io/projected/d157b838-4286-4cdb-9399-eea3bc5bb5fd-kube-api-access-2z8tz\") pod \"migrator-74bb7799d9-scg9x\" (UID: \"d157b838-4286-4cdb-9399-eea3bc5bb5fd\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg9x" Apr 20 21:14:13.142396 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:13.142354 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg9x" Apr 20 21:14:13.257172 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:13.257137 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg9x"] Apr 20 21:14:13.261203 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:13.261154 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd157b838_4286_4cdb_9399_eea3bc5bb5fd.slice/crio-53e8e160ad060a7b41c2b9deb9b19af481ec057d351600edd2cc3d64fc02d2e0 WatchSource:0}: Error finding container 53e8e160ad060a7b41c2b9deb9b19af481ec057d351600edd2cc3d64fc02d2e0: Status 404 returned error can't find the container with id 53e8e160ad060a7b41c2b9deb9b19af481ec057d351600edd2cc3d64fc02d2e0 Apr 20 21:14:13.662822 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:13.662795 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log" Apr 20 21:14:13.663230 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:13.663123 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/0.log" Apr 20 21:14:13.663230 ip-10-0-129-149 kubenswrapper[2571]: I0420 
21:14:13.663154 2571 generic.go:358] "Generic (PLEG): container finished" podID="838a3bd4-1a50-4127-a629-525bfede6ffd" containerID="b2069d6e2e91f02f40038c6ba50c3e8b932c98286a1a981c3c2c58c8212e5f4e" exitCode=255 Apr 20 21:14:13.663349 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:13.663232 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" event={"ID":"838a3bd4-1a50-4127-a629-525bfede6ffd","Type":"ContainerDied","Data":"b2069d6e2e91f02f40038c6ba50c3e8b932c98286a1a981c3c2c58c8212e5f4e"} Apr 20 21:14:13.663349 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:13.663261 2571 scope.go:117] "RemoveContainer" containerID="e1f235dc1c18670bb08957f0f49a90cd2a286dffd6ebaa1f0695aa0e1feb7f69" Apr 20 21:14:13.663546 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:13.663519 2571 scope.go:117] "RemoveContainer" containerID="b2069d6e2e91f02f40038c6ba50c3e8b932c98286a1a981c3c2c58c8212e5f4e" Apr 20 21:14:13.663733 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:13.663708 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-hjvgw_openshift-console-operator(838a3bd4-1a50-4127-a629-525bfede6ffd)\"" pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" podUID="838a3bd4-1a50-4127-a629-525bfede6ffd" Apr 20 21:14:13.664468 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:13.664349 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg9x" event={"ID":"d157b838-4286-4cdb-9399-eea3bc5bb5fd","Type":"ContainerStarted","Data":"53e8e160ad060a7b41c2b9deb9b19af481ec057d351600edd2cc3d64fc02d2e0"} Apr 20 21:14:13.707499 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:13.707476 2571 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9nwmh"] Apr 20 21:14:13.711500 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:13.711485 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nwmh" Apr 20 21:14:13.713358 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:13.713340 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-wqrrd\"" Apr 20 21:14:13.717014 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:13.716995 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9nwmh"] Apr 20 21:14:13.772424 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:13.772402 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr5fn\" (UniqueName: \"kubernetes.io/projected/f41d2d0b-36e7-42ab-a7e1-486ca3970554-kube-api-access-sr5fn\") pod \"network-check-source-8894fc9bd-9nwmh\" (UID: \"f41d2d0b-36e7-42ab-a7e1-486ca3970554\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nwmh" Apr 20 21:14:13.873262 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:13.873220 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sr5fn\" (UniqueName: \"kubernetes.io/projected/f41d2d0b-36e7-42ab-a7e1-486ca3970554-kube-api-access-sr5fn\") pod \"network-check-source-8894fc9bd-9nwmh\" (UID: \"f41d2d0b-36e7-42ab-a7e1-486ca3970554\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nwmh" Apr 20 21:14:13.881311 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:13.881283 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr5fn\" (UniqueName: \"kubernetes.io/projected/f41d2d0b-36e7-42ab-a7e1-486ca3970554-kube-api-access-sr5fn\") pod 
\"network-check-source-8894fc9bd-9nwmh\" (UID: \"f41d2d0b-36e7-42ab-a7e1-486ca3970554\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nwmh" Apr 20 21:14:14.019648 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:14.019610 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nwmh" Apr 20 21:14:14.155798 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:14.155763 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9nwmh"] Apr 20 21:14:14.159524 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:14.159496 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf41d2d0b_36e7_42ab_a7e1_486ca3970554.slice/crio-a091a2bb465c05472ebe68d213f4e1bc299c80f9113cc27278edfc7a114b265e WatchSource:0}: Error finding container a091a2bb465c05472ebe68d213f4e1bc299c80f9113cc27278edfc7a114b265e: Status 404 returned error can't find the container with id a091a2bb465c05472ebe68d213f4e1bc299c80f9113cc27278edfc7a114b265e Apr 20 21:14:14.669499 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:14.669481 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log" Apr 20 21:14:14.669852 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:14.669831 2571 scope.go:117] "RemoveContainer" containerID="b2069d6e2e91f02f40038c6ba50c3e8b932c98286a1a981c3c2c58c8212e5f4e" Apr 20 21:14:14.670050 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:14.670030 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-hjvgw_openshift-console-operator(838a3bd4-1a50-4127-a629-525bfede6ffd)\"" pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" podUID="838a3bd4-1a50-4127-a629-525bfede6ffd" Apr 20 21:14:14.670944 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:14.670921 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nwmh" event={"ID":"f41d2d0b-36e7-42ab-a7e1-486ca3970554","Type":"ContainerStarted","Data":"ccdb0682ebdbd755f879a8a2e2e4fea97f5ec5bfffcf33a49440d7af5fc7cbd4"} Apr 20 21:14:14.671039 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:14.670954 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nwmh" event={"ID":"f41d2d0b-36e7-42ab-a7e1-486ca3970554","Type":"ContainerStarted","Data":"a091a2bb465c05472ebe68d213f4e1bc299c80f9113cc27278edfc7a114b265e"} Apr 20 21:14:14.700403 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:14.700356 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9nwmh" podStartSLOduration=1.700340019 podStartE2EDuration="1.700340019s" podCreationTimestamp="2026-04-20 21:14:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:14:14.698857815 +0000 UTC m=+64.941816937" watchObservedRunningTime="2026-04-20 21:14:14.700340019 +0000 UTC m=+64.943299140" Apr 20 21:14:14.983962 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:14.983933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert\") pod \"ingress-canary-ld4fl\" (UID: \"08619f25-e76b-45d3-ab4b-8e9490d505f9\") " pod="openshift-ingress-canary/ingress-canary-ld4fl" Apr 20 21:14:14.984119 ip-10-0-129-149 kubenswrapper[2571]: 
I0420 21:14:14.983979 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd" Apr 20 21:14:14.984119 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:14.984083 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 21:14:14.984119 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:14.984109 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 21:14:14.984247 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:14.984149 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert podName:08619f25-e76b-45d3-ab4b-8e9490d505f9 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:46.984131701 +0000 UTC m=+97.227090814 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert") pod "ingress-canary-ld4fl" (UID: "08619f25-e76b-45d3-ab4b-8e9490d505f9") : secret "canary-serving-cert" not found Apr 20 21:14:14.984247 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:14.984162 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls podName:096e06e8-bf34-462d-9f43-fd87848fd09e nodeName:}" failed. No retries permitted until 2026-04-20 21:14:46.984156654 +0000 UTC m=+97.227115751 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls") pod "dns-default-cqwmd" (UID: "096e06e8-bf34-462d-9f43-fd87848fd09e") : secret "dns-default-metrics-tls" not found
Apr 20 21:14:15.224716 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.224676 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-vlrk4"]
Apr 20 21:14:15.228664 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.228636 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-vlrk4"
Apr 20 21:14:15.230698 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.230670 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 20 21:14:15.230845 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.230781 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 20 21:14:15.230962 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.230947 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-m588x\""
Apr 20 21:14:15.231021 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.230968 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 20 21:14:15.231136 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.231118 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 20 21:14:15.233246 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.233226 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-vlrk4"]
Apr 20 21:14:15.286998 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.286971 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6e15932d-aa1a-465b-9803-41c46fe3fdcf-signing-cabundle\") pod \"service-ca-865cb79987-vlrk4\" (UID: \"6e15932d-aa1a-465b-9803-41c46fe3fdcf\") " pod="openshift-service-ca/service-ca-865cb79987-vlrk4"
Apr 20 21:14:15.287144 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.287023 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vbzn\" (UniqueName: \"kubernetes.io/projected/6e15932d-aa1a-465b-9803-41c46fe3fdcf-kube-api-access-8vbzn\") pod \"service-ca-865cb79987-vlrk4\" (UID: \"6e15932d-aa1a-465b-9803-41c46fe3fdcf\") " pod="openshift-service-ca/service-ca-865cb79987-vlrk4"
Apr 20 21:14:15.287207 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.287146 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6e15932d-aa1a-465b-9803-41c46fe3fdcf-signing-key\") pod \"service-ca-865cb79987-vlrk4\" (UID: \"6e15932d-aa1a-465b-9803-41c46fe3fdcf\") " pod="openshift-service-ca/service-ca-865cb79987-vlrk4"
Apr 20 21:14:15.387805 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.387751 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6e15932d-aa1a-465b-9803-41c46fe3fdcf-signing-key\") pod \"service-ca-865cb79987-vlrk4\" (UID: \"6e15932d-aa1a-465b-9803-41c46fe3fdcf\") " pod="openshift-service-ca/service-ca-865cb79987-vlrk4"
Apr 20 21:14:15.387805 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.387809 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6e15932d-aa1a-465b-9803-41c46fe3fdcf-signing-cabundle\") pod \"service-ca-865cb79987-vlrk4\" (UID: \"6e15932d-aa1a-465b-9803-41c46fe3fdcf\") " pod="openshift-service-ca/service-ca-865cb79987-vlrk4"
Apr 20 21:14:15.388011 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.387850 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vbzn\" (UniqueName: \"kubernetes.io/projected/6e15932d-aa1a-465b-9803-41c46fe3fdcf-kube-api-access-8vbzn\") pod \"service-ca-865cb79987-vlrk4\" (UID: \"6e15932d-aa1a-465b-9803-41c46fe3fdcf\") " pod="openshift-service-ca/service-ca-865cb79987-vlrk4"
Apr 20 21:14:15.389196 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.389158 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6e15932d-aa1a-465b-9803-41c46fe3fdcf-signing-cabundle\") pod \"service-ca-865cb79987-vlrk4\" (UID: \"6e15932d-aa1a-465b-9803-41c46fe3fdcf\") " pod="openshift-service-ca/service-ca-865cb79987-vlrk4"
Apr 20 21:14:15.390985 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.390964 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6e15932d-aa1a-465b-9803-41c46fe3fdcf-signing-key\") pod \"service-ca-865cb79987-vlrk4\" (UID: \"6e15932d-aa1a-465b-9803-41c46fe3fdcf\") " pod="openshift-service-ca/service-ca-865cb79987-vlrk4"
Apr 20 21:14:15.395280 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.395259 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vbzn\" (UniqueName: \"kubernetes.io/projected/6e15932d-aa1a-465b-9803-41c46fe3fdcf-kube-api-access-8vbzn\") pod \"service-ca-865cb79987-vlrk4\" (UID: \"6e15932d-aa1a-465b-9803-41c46fe3fdcf\") " pod="openshift-service-ca/service-ca-865cb79987-vlrk4"
Apr 20 21:14:15.538410 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.538300 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-vlrk4"
Apr 20 21:14:15.652473 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.652444 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-vlrk4"]
Apr 20 21:14:15.656941 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:15.656911 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e15932d_aa1a_465b_9803_41c46fe3fdcf.slice/crio-d7ad6806d67e2e19292db1572c05b52257e15b99d5e7335c98bd917e7d8d8004 WatchSource:0}: Error finding container d7ad6806d67e2e19292db1572c05b52257e15b99d5e7335c98bd917e7d8d8004: Status 404 returned error can't find the container with id d7ad6806d67e2e19292db1572c05b52257e15b99d5e7335c98bd917e7d8d8004
Apr 20 21:14:15.675881 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.675850 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg9x" event={"ID":"d157b838-4286-4cdb-9399-eea3bc5bb5fd","Type":"ContainerStarted","Data":"d6766a9621919a9f0cb37e21f97ea3f081d231f0a62daa6b28f979f521cc714c"}
Apr 20 21:14:15.676266 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.675886 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg9x" event={"ID":"d157b838-4286-4cdb-9399-eea3bc5bb5fd","Type":"ContainerStarted","Data":"89453f8f4dd9a3accd61f0a836ae48f6c30110da73b799cae2b9b7c6a62acfc1"}
Apr 20 21:14:15.676967 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.676947 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-vlrk4" event={"ID":"6e15932d-aa1a-465b-9803-41c46fe3fdcf","Type":"ContainerStarted","Data":"d7ad6806d67e2e19292db1572c05b52257e15b99d5e7335c98bd917e7d8d8004"}
Apr 20 21:14:15.690359 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.690310 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-scg9x" podStartSLOduration=2.320360863 podStartE2EDuration="3.690296803s" podCreationTimestamp="2026-04-20 21:14:12 +0000 UTC" firstStartedPulling="2026-04-20 21:14:13.263011315 +0000 UTC m=+63.505970417" lastFinishedPulling="2026-04-20 21:14:14.632947236 +0000 UTC m=+64.875906357" observedRunningTime="2026-04-20 21:14:15.689887486 +0000 UTC m=+65.932846604" watchObservedRunningTime="2026-04-20 21:14:15.690296803 +0000 UTC m=+65.933255926"
Apr 20 21:14:15.691041 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.691022 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-mvjf7\" (UID: \"86ba4aaf-0c45-4967-a1d3-51755a3cd672\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7"
Apr 20 21:14:15.691128 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:15.691108 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 21:14:15.691231 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:15.691217 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls podName:86ba4aaf-0c45-4967-a1d3-51755a3cd672 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:23.691197309 +0000 UTC m=+73.934156421 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-mvjf7" (UID: "86ba4aaf-0c45-4967-a1d3-51755a3cd672") : secret "samples-operator-tls" not found
Apr 20 21:14:15.694574 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.694560 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9xgfk_47953ca1-cc2f-4035-8d59-26be8c7a9516/dns-node-resolver/0.log"
Apr 20 21:14:15.791443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.791346 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm"
Apr 20 21:14:15.791443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.791402 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm"
Apr 20 21:14:15.791636 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:15.791524 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle podName:299bba46-6418-4baf-8a89-6db7597a7bc4 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:23.791502408 +0000 UTC m=+74.034461509 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle") pod "router-default-79b56464d-ztxdm" (UID: "299bba46-6418-4baf-8a89-6db7597a7bc4") : configmap references non-existent config key: service-ca.crt
Apr 20 21:14:15.791636 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:15.791554 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 21:14:15.791636 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:15.791595 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs podName:299bba46-6418-4baf-8a89-6db7597a7bc4 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:23.791583965 +0000 UTC m=+74.034543062 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs") pod "router-default-79b56464d-ztxdm" (UID: "299bba46-6418-4baf-8a89-6db7597a7bc4") : secret "router-metrics-certs-default" not found
Apr 20 21:14:15.993852 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:15.993816 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs\") pod \"network-metrics-daemon-fk9cw\" (UID: \"89e3c54c-a866-4c9b-940d-54a417b5c964\") " pod="openshift-multus/network-metrics-daemon-fk9cw"
Apr 20 21:14:15.993994 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:15.993934 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 21:14:15.994029 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:15.993995 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs podName:89e3c54c-a866-4c9b-940d-54a417b5c964 nodeName:}" failed. No retries permitted until 2026-04-20 21:15:19.993978058 +0000 UTC m=+130.236937176 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs") pod "network-metrics-daemon-fk9cw" (UID: "89e3c54c-a866-4c9b-940d-54a417b5c964") : secret "metrics-daemon-secret" not found
Apr 20 21:14:16.496513 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:16.496486 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fwbvz_95826e15-25fd-44ed-bc3e-c54baaa50bb7/node-ca/0.log"
Apr 20 21:14:17.685773 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:17.685690 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-vlrk4" event={"ID":"6e15932d-aa1a-465b-9803-41c46fe3fdcf","Type":"ContainerStarted","Data":"875f39e6f922085b1242a11fee8e090ef1dfcd28358cb1bff8a27656534b73d9"}
Apr 20 21:14:17.699978 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:17.699924 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-vlrk4" podStartSLOduration=1.039139981 podStartE2EDuration="2.699910326s" podCreationTimestamp="2026-04-20 21:14:15 +0000 UTC" firstStartedPulling="2026-04-20 21:14:15.658816881 +0000 UTC m=+65.901775978" lastFinishedPulling="2026-04-20 21:14:17.31958721 +0000 UTC m=+67.562546323" observedRunningTime="2026-04-20 21:14:17.698938566 +0000 UTC m=+67.941897683" watchObservedRunningTime="2026-04-20 21:14:17.699910326 +0000 UTC m=+67.942869446"
Apr 20 21:14:17.895462 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:17.895413 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-scg9x_d157b838-4286-4cdb-9399-eea3bc5bb5fd/migrator/0.log"
Apr 20 21:14:18.095473 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:18.095440 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-scg9x_d157b838-4286-4cdb-9399-eea3bc5bb5fd/graceful-termination/0.log"
Apr 20 21:14:18.294440 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:18.294398 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw"
Apr 20 21:14:18.294440 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:18.294447 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw"
Apr 20 21:14:18.294925 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:18.294907 2571 scope.go:117] "RemoveContainer" containerID="b2069d6e2e91f02f40038c6ba50c3e8b932c98286a1a981c3c2c58c8212e5f4e"
Apr 20 21:14:18.295143 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:18.295119 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-hjvgw_openshift-console-operator(838a3bd4-1a50-4127-a629-525bfede6ffd)\"" pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" podUID="838a3bd4-1a50-4127-a629-525bfede6ffd"
Apr 20 21:14:18.296855 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:18.296830 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-8gwlz_36c761e2-ffbf-4d69-8d19-9b3793a3acf9/kube-storage-version-migrator-operator/0.log"
Apr 20 21:14:18.606573 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:18.606534 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cxz7h"
Apr 20 21:14:23.760945 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:23.760910 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-mvjf7\" (UID: \"86ba4aaf-0c45-4967-a1d3-51755a3cd672\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7"
Apr 20 21:14:23.763570 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:23.763540 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/86ba4aaf-0c45-4967-a1d3-51755a3cd672-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-mvjf7\" (UID: \"86ba4aaf-0c45-4967-a1d3-51755a3cd672\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7"
Apr 20 21:14:23.798309 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:23.798274 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-xwsvd\""
Apr 20 21:14:23.805982 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:23.805960 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7"
Apr 20 21:14:23.862372 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:23.862343 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm"
Apr 20 21:14:23.862514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:23.862386 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm"
Apr 20 21:14:23.862629 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:23.862606 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle podName:299bba46-6418-4baf-8a89-6db7597a7bc4 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:39.862579499 +0000 UTC m=+90.105538600 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle") pod "router-default-79b56464d-ztxdm" (UID: "299bba46-6418-4baf-8a89-6db7597a7bc4") : configmap references non-existent config key: service-ca.crt
Apr 20 21:14:23.864859 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:23.864831 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/299bba46-6418-4baf-8a89-6db7597a7bc4-metrics-certs\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm"
Apr 20 21:14:23.923007 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:23.922979 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7"]
Apr 20 21:14:24.708293 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:24.708248 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7" event={"ID":"86ba4aaf-0c45-4967-a1d3-51755a3cd672","Type":"ContainerStarted","Data":"088970234c8b5fec86effacbda9369984550ac377848b325b1293e95345b2235"}
Apr 20 21:14:26.715059 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:26.715018 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7" event={"ID":"86ba4aaf-0c45-4967-a1d3-51755a3cd672","Type":"ContainerStarted","Data":"14fdf8300791ca1706f2c21b27274892b285380fc2a8198bb93e76644c453cba"}
Apr 20 21:14:26.715059 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:26.715054 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7" event={"ID":"86ba4aaf-0c45-4967-a1d3-51755a3cd672","Type":"ContainerStarted","Data":"77ee53383b98523baba497dbb998dd7622dbe9d356266098919de26679eac3ed"}
Apr 20 21:14:26.730859 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:26.730806 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-mvjf7" podStartSLOduration=17.746343443 podStartE2EDuration="19.730792151s" podCreationTimestamp="2026-04-20 21:14:07 +0000 UTC" firstStartedPulling="2026-04-20 21:14:23.976638196 +0000 UTC m=+74.219597294" lastFinishedPulling="2026-04-20 21:14:25.961086889 +0000 UTC m=+76.204046002" observedRunningTime="2026-04-20 21:14:26.729262567 +0000 UTC m=+76.972221686" watchObservedRunningTime="2026-04-20 21:14:26.730792151 +0000 UTC m=+76.973751270"
Apr 20 21:14:33.337407 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:33.337371 2571 scope.go:117] "RemoveContainer" containerID="b2069d6e2e91f02f40038c6ba50c3e8b932c98286a1a981c3c2c58c8212e5f4e"
Apr 20 21:14:33.734997 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:33.734970 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log"
Apr 20 21:14:33.735282 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:33.735033 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" event={"ID":"838a3bd4-1a50-4127-a629-525bfede6ffd","Type":"ContainerStarted","Data":"32b03b1dde0788c128fd8bcc0ef9ad854d11fdb4099324a2cfcbe33eee1427f7"}
Apr 20 21:14:33.735357 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:33.735281 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw"
Apr 20 21:14:33.751028 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:33.750974 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw" podStartSLOduration=23.583523601 podStartE2EDuration="26.750960926s" podCreationTimestamp="2026-04-20 21:14:07 +0000 UTC" firstStartedPulling="2026-04-20 21:14:08.470099522 +0000 UTC m=+58.713058620" lastFinishedPulling="2026-04-20 21:14:11.637536833 +0000 UTC m=+61.880495945" observedRunningTime="2026-04-20 21:14:33.75021718 +0000 UTC m=+83.993176300" watchObservedRunningTime="2026-04-20 21:14:33.750960926 +0000 UTC m=+83.993920046"
Apr 20 21:14:33.765831 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:33.765791 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-hjvgw"
Apr 20 21:14:34.127539 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.127460 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-z4nxr"]
Apr 20 21:14:34.130481 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.130448 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.132534 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.132508 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 21:14:34.132669 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.132642 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 21:14:34.132896 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.132879 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4kwqq\""
Apr 20 21:14:34.139888 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.139865 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-z4nxr"]
Apr 20 21:14:34.207966 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.207938 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-s2hsb"]
Apr 20 21:14:34.210989 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.210968 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-s2hsb"
Apr 20 21:14:34.214274 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.214253 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 21:14:34.214682 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.214666 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 21:14:34.215424 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.215410 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-9zcsc\""
Apr 20 21:14:34.222346 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.222319 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-56b9995c8d-m5vt9"]
Apr 20 21:14:34.226381 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.226361 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9"
Apr 20 21:14:34.233744 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.233724 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 21:14:34.233953 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.233938 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-4cm2d\""
Apr 20 21:14:34.234023 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.234011 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 21:14:34.234402 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.234386 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 21:14:34.239444 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.239425 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28xpc\" (UniqueName: \"kubernetes.io/projected/dd7b97ad-bb33-454d-96b8-cbbc807198ef-kube-api-access-28xpc\") pod \"insights-runtime-extractor-z4nxr\" (UID: \"dd7b97ad-bb33-454d-96b8-cbbc807198ef\") " pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.239517 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.239454 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dd7b97ad-bb33-454d-96b8-cbbc807198ef-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-z4nxr\" (UID: \"dd7b97ad-bb33-454d-96b8-cbbc807198ef\") " pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.239517 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.239476 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd7b97ad-bb33-454d-96b8-cbbc807198ef-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-z4nxr\" (UID: \"dd7b97ad-bb33-454d-96b8-cbbc807198ef\") " pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.239588 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.239542 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dd7b97ad-bb33-454d-96b8-cbbc807198ef-data-volume\") pod \"insights-runtime-extractor-z4nxr\" (UID: \"dd7b97ad-bb33-454d-96b8-cbbc807198ef\") " pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.239622 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.239588 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dd7b97ad-bb33-454d-96b8-cbbc807198ef-crio-socket\") pod \"insights-runtime-extractor-z4nxr\" (UID: \"dd7b97ad-bb33-454d-96b8-cbbc807198ef\") " pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.243436 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.243418 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-s2hsb"]
Apr 20 21:14:34.251593 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.251575 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 21:14:34.262552 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.262531 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56b9995c8d-m5vt9"]
Apr 20 21:14:34.339882 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.339854 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28xpc\" (UniqueName: \"kubernetes.io/projected/dd7b97ad-bb33-454d-96b8-cbbc807198ef-kube-api-access-28xpc\") pod \"insights-runtime-extractor-z4nxr\" (UID: \"dd7b97ad-bb33-454d-96b8-cbbc807198ef\") " pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.340348 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.339889 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dd7b97ad-bb33-454d-96b8-cbbc807198ef-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-z4nxr\" (UID: \"dd7b97ad-bb33-454d-96b8-cbbc807198ef\") " pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.340348 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.339916 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-image-registry-private-configuration\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9"
Apr 20 21:14:34.340348 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.340090 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd7b97ad-bb33-454d-96b8-cbbc807198ef-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-z4nxr\" (UID: \"dd7b97ad-bb33-454d-96b8-cbbc807198ef\") " pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.340348 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.340151 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-registry-certificates\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9"
Apr 20 21:14:34.340348 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.340196 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-trusted-ca\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9"
Apr 20 21:14:34.340348 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.340231 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dd7b97ad-bb33-454d-96b8-cbbc807198ef-data-volume\") pod \"insights-runtime-extractor-z4nxr\" (UID: \"dd7b97ad-bb33-454d-96b8-cbbc807198ef\") " pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.340348 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.340301 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrwwf\" (UniqueName: \"kubernetes.io/projected/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-kube-api-access-zrwwf\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9"
Apr 20 21:14:34.340718 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.340414 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-registry-tls\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9"
Apr 20 21:14:34.340718 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.340439 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dd7b97ad-bb33-454d-96b8-cbbc807198ef-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-z4nxr\" (UID: \"dd7b97ad-bb33-454d-96b8-cbbc807198ef\") " pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.340718 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.340445 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dd7b97ad-bb33-454d-96b8-cbbc807198ef-crio-socket\") pod \"insights-runtime-extractor-z4nxr\" (UID: \"dd7b97ad-bb33-454d-96b8-cbbc807198ef\") " pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.340718 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.340497 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dd7b97ad-bb33-454d-96b8-cbbc807198ef-crio-socket\") pod \"insights-runtime-extractor-z4nxr\" (UID: \"dd7b97ad-bb33-454d-96b8-cbbc807198ef\") " pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.340718 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.340525 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-ca-trust-extracted\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9"
Apr 20 21:14:34.340718 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.340571 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dd7b97ad-bb33-454d-96b8-cbbc807198ef-data-volume\") pod \"insights-runtime-extractor-z4nxr\" (UID: \"dd7b97ad-bb33-454d-96b8-cbbc807198ef\") " pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.340718 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.340577 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-installation-pull-secrets\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9"
Apr 20 21:14:34.340718 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.340663 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnskq\" (UniqueName: \"kubernetes.io/projected/ce5dfb59-8be5-484c-868f-587ecd9948e3-kube-api-access-wnskq\") pod \"downloads-6bcc868b7-s2hsb\" (UID: \"ce5dfb59-8be5-484c-868f-587ecd9948e3\") " pod="openshift-console/downloads-6bcc868b7-s2hsb"
Apr 20 21:14:34.340718 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.340692 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-bound-sa-token\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9"
Apr 20 21:14:34.342548 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.342527 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd7b97ad-bb33-454d-96b8-cbbc807198ef-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-z4nxr\" (UID: \"dd7b97ad-bb33-454d-96b8-cbbc807198ef\") " pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.348826 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.348804 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28xpc\" (UniqueName: \"kubernetes.io/projected/dd7b97ad-bb33-454d-96b8-cbbc807198ef-kube-api-access-28xpc\") pod \"insights-runtime-extractor-z4nxr\" (UID: \"dd7b97ad-bb33-454d-96b8-cbbc807198ef\") " pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.440786 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.440709 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-z4nxr"
Apr 20 21:14:34.441037 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.441017 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-installation-pull-secrets\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9"
Apr 20 21:14:34.441091 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.441066 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wnskq\" (UniqueName: \"kubernetes.io/projected/ce5dfb59-8be5-484c-868f-587ecd9948e3-kube-api-access-wnskq\") pod \"downloads-6bcc868b7-s2hsb\" (UID: \"ce5dfb59-8be5-484c-868f-587ecd9948e3\") " pod="openshift-console/downloads-6bcc868b7-s2hsb"
Apr 20 21:14:34.441091 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.441085 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-bound-sa-token\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9"
Apr 20 21:14:34.441210 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.441104 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-image-registry-private-configuration\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:34.441210 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.441140 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-registry-certificates\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:34.441210 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.441162 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-trusted-ca\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:34.441364 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.441222 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrwwf\" (UniqueName: \"kubernetes.io/projected/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-kube-api-access-zrwwf\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:34.441364 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.441255 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-registry-tls\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " 
pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:34.441364 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.441302 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-ca-trust-extracted\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:34.442105 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.441672 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-ca-trust-extracted\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:34.442916 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.442893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-trusted-ca\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:34.443041 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.442934 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-registry-certificates\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:34.443871 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.443849 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-installation-pull-secrets\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:34.444292 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.444271 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-image-registry-private-configuration\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:34.444825 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.444805 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-registry-tls\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:34.449471 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.449394 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnskq\" (UniqueName: \"kubernetes.io/projected/ce5dfb59-8be5-484c-868f-587ecd9948e3-kube-api-access-wnskq\") pod \"downloads-6bcc868b7-s2hsb\" (UID: \"ce5dfb59-8be5-484c-868f-587ecd9948e3\") " pod="openshift-console/downloads-6bcc868b7-s2hsb" Apr 20 21:14:34.449654 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.449631 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-bound-sa-token\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:34.449791 ip-10-0-129-149 
kubenswrapper[2571]: I0420 21:14:34.449774 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrwwf\" (UniqueName: \"kubernetes.io/projected/adb3353d-8a6b-4b5d-9dbe-795907ebf77a-kube-api-access-zrwwf\") pod \"image-registry-56b9995c8d-m5vt9\" (UID: \"adb3353d-8a6b-4b5d-9dbe-795907ebf77a\") " pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:34.519583 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.519550 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-s2hsb" Apr 20 21:14:34.535840 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.535807 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:34.570049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.569959 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-z4nxr"] Apr 20 21:14:34.576637 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:34.576567 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd7b97ad_bb33_454d_96b8_cbbc807198ef.slice/crio-d1eca968b9dfe877cb4251014fee942f1a9bcda37deaa1204c5a67751eb73c4f WatchSource:0}: Error finding container d1eca968b9dfe877cb4251014fee942f1a9bcda37deaa1204c5a67751eb73c4f: Status 404 returned error can't find the container with id d1eca968b9dfe877cb4251014fee942f1a9bcda37deaa1204c5a67751eb73c4f Apr 20 21:14:34.659011 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.658981 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-s2hsb"] Apr 20 21:14:34.661972 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:34.661927 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce5dfb59_8be5_484c_868f_587ecd9948e3.slice/crio-96cf7903469cf1a999ec7f86d38116b6edb4273ac2e7093df348b427ef2454ac WatchSource:0}: Error finding container 96cf7903469cf1a999ec7f86d38116b6edb4273ac2e7093df348b427ef2454ac: Status 404 returned error can't find the container with id 96cf7903469cf1a999ec7f86d38116b6edb4273ac2e7093df348b427ef2454ac Apr 20 21:14:34.676116 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.676087 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56b9995c8d-m5vt9"] Apr 20 21:14:34.682519 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:34.682490 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadb3353d_8a6b_4b5d_9dbe_795907ebf77a.slice/crio-48b9844a1086f9dbda6d799187d71ce3abe6ff344d7b44e948ff965ed6308a22 WatchSource:0}: Error finding container 48b9844a1086f9dbda6d799187d71ce3abe6ff344d7b44e948ff965ed6308a22: Status 404 returned error can't find the container with id 48b9844a1086f9dbda6d799187d71ce3abe6ff344d7b44e948ff965ed6308a22 Apr 20 21:14:34.738315 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.738282 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z4nxr" event={"ID":"dd7b97ad-bb33-454d-96b8-cbbc807198ef","Type":"ContainerStarted","Data":"07f89b83e68380d972a156eeaa9c86858683a6432b7fdccac5fed2bae68b1a09"} Apr 20 21:14:34.738426 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.738324 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z4nxr" event={"ID":"dd7b97ad-bb33-454d-96b8-cbbc807198ef","Type":"ContainerStarted","Data":"d1eca968b9dfe877cb4251014fee942f1a9bcda37deaa1204c5a67751eb73c4f"} Apr 20 21:14:34.739804 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.739775 2571 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" event={"ID":"adb3353d-8a6b-4b5d-9dbe-795907ebf77a","Type":"ContainerStarted","Data":"48b9844a1086f9dbda6d799187d71ce3abe6ff344d7b44e948ff965ed6308a22"} Apr 20 21:14:34.740770 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:34.740745 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-s2hsb" event={"ID":"ce5dfb59-8be5-484c-868f-587ecd9948e3","Type":"ContainerStarted","Data":"96cf7903469cf1a999ec7f86d38116b6edb4273ac2e7093df348b427ef2454ac"} Apr 20 21:14:35.744862 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:35.744824 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z4nxr" event={"ID":"dd7b97ad-bb33-454d-96b8-cbbc807198ef","Type":"ContainerStarted","Data":"1b0dd0ce16d8d1d9e17adf03f96254c8e1e58d066830b435cfaf51933e8875cd"} Apr 20 21:14:35.746401 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:35.746373 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" event={"ID":"adb3353d-8a6b-4b5d-9dbe-795907ebf77a","Type":"ContainerStarted","Data":"e1dcae0706c1d9e219b2ff5b2f6ac2c253de003760aa565985e1d2e1cab2df7d"} Apr 20 21:14:35.768080 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:35.768022 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" podStartSLOduration=1.768000966 podStartE2EDuration="1.768000966s" podCreationTimestamp="2026-04-20 21:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:14:35.767147631 +0000 UTC m=+86.010106754" watchObservedRunningTime="2026-04-20 21:14:35.768000966 +0000 UTC m=+86.010960088" Apr 20 21:14:36.749789 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:36.749755 2571 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" Apr 20 21:14:37.754074 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:37.754028 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-z4nxr" event={"ID":"dd7b97ad-bb33-454d-96b8-cbbc807198ef","Type":"ContainerStarted","Data":"16907bb1c24fab73f206804d72c8ad30a1f1b73a399ceed4b4a68b75958d0959"} Apr 20 21:14:37.770149 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:37.770105 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-z4nxr" podStartSLOduration=1.3440245960000001 podStartE2EDuration="3.770090885s" podCreationTimestamp="2026-04-20 21:14:34 +0000 UTC" firstStartedPulling="2026-04-20 21:14:34.669027285 +0000 UTC m=+84.911986383" lastFinishedPulling="2026-04-20 21:14:37.095093554 +0000 UTC m=+87.338052672" observedRunningTime="2026-04-20 21:14:37.76864008 +0000 UTC m=+88.011599203" watchObservedRunningTime="2026-04-20 21:14:37.770090885 +0000 UTC m=+88.013050006" Apr 20 21:14:39.894495 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:39.894464 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:39.895035 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:39.895017 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/299bba46-6418-4baf-8a89-6db7597a7bc4-service-ca-bundle\") pod \"router-default-79b56464d-ztxdm\" (UID: \"299bba46-6418-4baf-8a89-6db7597a7bc4\") " pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:40.109641 ip-10-0-129-149 
kubenswrapper[2571]: I0420 21:14:40.109607 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-82gsd\"" Apr 20 21:14:40.118343 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:40.118312 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:40.283370 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:40.283317 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-79b56464d-ztxdm"] Apr 20 21:14:40.286593 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:40.286564 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod299bba46_6418_4baf_8a89_6db7597a7bc4.slice/crio-97e5ba238633f678eaabf5acfdb2ce0788186765d970bbd0184c6014034273ad WatchSource:0}: Error finding container 97e5ba238633f678eaabf5acfdb2ce0788186765d970bbd0184c6014034273ad: Status 404 returned error can't find the container with id 97e5ba238633f678eaabf5acfdb2ce0788186765d970bbd0184c6014034273ad Apr 20 21:14:40.764579 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:40.764532 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79b56464d-ztxdm" event={"ID":"299bba46-6418-4baf-8a89-6db7597a7bc4","Type":"ContainerStarted","Data":"8a809d94b5d1d6922bb27c5ef9141f130711a70ecb352be086d1bb9dd4784d36"} Apr 20 21:14:40.764772 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:40.764620 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79b56464d-ztxdm" event={"ID":"299bba46-6418-4baf-8a89-6db7597a7bc4","Type":"ContainerStarted","Data":"97e5ba238633f678eaabf5acfdb2ce0788186765d970bbd0184c6014034273ad"} Apr 20 21:14:40.784366 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:40.784317 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress/router-default-79b56464d-ztxdm" podStartSLOduration=33.784300158 podStartE2EDuration="33.784300158s" podCreationTimestamp="2026-04-20 21:14:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:14:40.782879278 +0000 UTC m=+91.025838426" watchObservedRunningTime="2026-04-20 21:14:40.784300158 +0000 UTC m=+91.027259279" Apr 20 21:14:41.118650 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:41.118561 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:41.121317 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:41.121293 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:41.768446 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:41.768410 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:41.769792 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:41.769769 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-79b56464d-ztxdm" Apr 20 21:14:47.060120 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:47.060079 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert\") pod \"ingress-canary-ld4fl\" (UID: \"08619f25-e76b-45d3-ab4b-8e9490d505f9\") " pod="openshift-ingress-canary/ingress-canary-ld4fl" Apr 20 21:14:47.060539 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:47.060156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls\") pod \"dns-default-cqwmd\" (UID: 
\"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd" Apr 20 21:14:47.062974 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:47.062923 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/096e06e8-bf34-462d-9f43-fd87848fd09e-metrics-tls\") pod \"dns-default-cqwmd\" (UID: \"096e06e8-bf34-462d-9f43-fd87848fd09e\") " pod="openshift-dns/dns-default-cqwmd" Apr 20 21:14:47.062974 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:47.062954 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08619f25-e76b-45d3-ab4b-8e9490d505f9-cert\") pod \"ingress-canary-ld4fl\" (UID: \"08619f25-e76b-45d3-ab4b-8e9490d505f9\") " pod="openshift-ingress-canary/ingress-canary-ld4fl" Apr 20 21:14:47.085454 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:47.085420 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vtslr\"" Apr 20 21:14:47.093090 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:47.093062 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kkzr9\"" Apr 20 21:14:47.094104 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:47.094082 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cqwmd" Apr 20 21:14:47.101770 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:47.101751 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ld4fl" Apr 20 21:14:51.219713 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:51.219683 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ld4fl"] Apr 20 21:14:51.228284 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:51.228253 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08619f25_e76b_45d3_ab4b_8e9490d505f9.slice/crio-9b72532c2f82564c5d17e46ca84eb44634debd7d5f744c5b68b2fa481737f018 WatchSource:0}: Error finding container 9b72532c2f82564c5d17e46ca84eb44634debd7d5f744c5b68b2fa481737f018: Status 404 returned error can't find the container with id 9b72532c2f82564c5d17e46ca84eb44634debd7d5f744c5b68b2fa481737f018 Apr 20 21:14:51.238066 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:51.238045 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cqwmd"] Apr 20 21:14:51.240054 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:51.240030 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod096e06e8_bf34_462d_9f43_fd87848fd09e.slice/crio-47aec3939158756502f9ee6a6a9775494fe9230f62d79d830671819ce7e26311 WatchSource:0}: Error finding container 47aec3939158756502f9ee6a6a9775494fe9230f62d79d830671819ce7e26311: Status 404 returned error can't find the container with id 47aec3939158756502f9ee6a6a9775494fe9230f62d79d830671819ce7e26311 Apr 20 21:14:51.799022 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:51.798980 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-s2hsb" event={"ID":"ce5dfb59-8be5-484c-868f-587ecd9948e3","Type":"ContainerStarted","Data":"0be1c21eec1665e3caef438986485942c76124664a7d2b49c6af26a8b2af5e7a"} Apr 20 21:14:51.799323 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:51.799287 2571 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-s2hsb" Apr 20 21:14:51.800621 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:51.800581 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ld4fl" event={"ID":"08619f25-e76b-45d3-ab4b-8e9490d505f9","Type":"ContainerStarted","Data":"9b72532c2f82564c5d17e46ca84eb44634debd7d5f744c5b68b2fa481737f018"} Apr 20 21:14:51.803623 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:51.803599 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cqwmd" event={"ID":"096e06e8-bf34-462d-9f43-fd87848fd09e","Type":"ContainerStarted","Data":"47aec3939158756502f9ee6a6a9775494fe9230f62d79d830671819ce7e26311"} Apr 20 21:14:51.817072 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:51.817029 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-s2hsb" podStartSLOduration=1.288573103 podStartE2EDuration="17.817016046s" podCreationTimestamp="2026-04-20 21:14:34 +0000 UTC" firstStartedPulling="2026-04-20 21:14:34.664088527 +0000 UTC m=+84.907047624" lastFinishedPulling="2026-04-20 21:14:51.192531454 +0000 UTC m=+101.435490567" observedRunningTime="2026-04-20 21:14:51.815244097 +0000 UTC m=+102.058203218" watchObservedRunningTime="2026-04-20 21:14:51.817016046 +0000 UTC m=+102.059975165" Apr 20 21:14:51.819296 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:51.819274 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-s2hsb" Apr 20 21:14:52.600571 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.598308 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"] Apr 20 21:14:52.604226 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.603806 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9" Apr 20 21:14:52.607324 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.607283 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nmpg7"] Apr 20 21:14:52.608919 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.608883 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-hgk8k\"" Apr 20 21:14:52.609163 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.609134 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 20 21:14:52.609528 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.609495 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 21:14:52.610076 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.609698 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 21:14:52.610076 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.609861 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 20 21:14:52.610076 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.609913 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 21:14:52.612070 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.611406 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.613826 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.613795 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 21:14:52.614279 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.614229 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hrxkm\""
Apr 20 21:14:52.614624 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.613809 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 21:14:52.615038 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.614935 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 21:14:52.615897 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.615845 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"]
Apr 20 21:14:52.721229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.717801 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0226bfa3-7ec0-490e-944a-1e60da426ea1-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-826s9\" (UID: \"0226bfa3-7ec0-490e-944a-1e60da426ea1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"
Apr 20 21:14:52.721229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.717870 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwjgf\" (UniqueName: \"kubernetes.io/projected/0226bfa3-7ec0-490e-944a-1e60da426ea1-kube-api-access-pwjgf\") pod \"openshift-state-metrics-9d44df66c-826s9\" (UID: \"0226bfa3-7ec0-490e-944a-1e60da426ea1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"
Apr 20 21:14:52.721229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.717908 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.721229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.717939 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-textfile\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.721229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.717964 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-tls\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.721229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.718004 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0226bfa3-7ec0-490e-944a-1e60da426ea1-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-826s9\" (UID: \"0226bfa3-7ec0-490e-944a-1e60da426ea1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"
Apr 20 21:14:52.721229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.718065 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/67b17fac-14b3-453d-b0aa-0062a9cf986e-sys\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.721229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.718117 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-accelerators-collector-config\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.721229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.718204 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0226bfa3-7ec0-490e-944a-1e60da426ea1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-826s9\" (UID: \"0226bfa3-7ec0-490e-944a-1e60da426ea1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"
Apr 20 21:14:52.721229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.718231 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/67b17fac-14b3-453d-b0aa-0062a9cf986e-metrics-client-ca\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.721229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.718285 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/67b17fac-14b3-453d-b0aa-0062a9cf986e-root\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.721229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.718341 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-wtmp\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.721229 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.718373 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5xbz\" (UniqueName: \"kubernetes.io/projected/67b17fac-14b3-453d-b0aa-0062a9cf986e-kube-api-access-r5xbz\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.821341 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.821313 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwjgf\" (UniqueName: \"kubernetes.io/projected/0226bfa3-7ec0-490e-944a-1e60da426ea1-kube-api-access-pwjgf\") pod \"openshift-state-metrics-9d44df66c-826s9\" (UID: \"0226bfa3-7ec0-490e-944a-1e60da426ea1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"
Apr 20 21:14:52.821604 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.821587 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.821753 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.821741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-textfile\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.821976 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:52.821963 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 21:14:52.822618 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.822068 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-tls\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.822618 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:52.822146 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-tls podName:67b17fac-14b3-453d-b0aa-0062a9cf986e nodeName:}" failed. No retries permitted until 2026-04-20 21:14:53.322125465 +0000 UTC m=+103.565084567 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-tls") pod "node-exporter-nmpg7" (UID: "67b17fac-14b3-453d-b0aa-0062a9cf986e") : secret "node-exporter-tls" not found
Apr 20 21:14:52.822816 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.822172 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0226bfa3-7ec0-490e-944a-1e60da426ea1-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-826s9\" (UID: \"0226bfa3-7ec0-490e-944a-1e60da426ea1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"
Apr 20 21:14:52.823205 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.823152 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-textfile\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.823301 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.823282 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/67b17fac-14b3-453d-b0aa-0062a9cf986e-sys\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.827077 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.825712 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/67b17fac-14b3-453d-b0aa-0062a9cf986e-sys\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.827077 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.825806 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-accelerators-collector-config\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.827077 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.825847 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0226bfa3-7ec0-490e-944a-1e60da426ea1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-826s9\" (UID: \"0226bfa3-7ec0-490e-944a-1e60da426ea1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"
Apr 20 21:14:52.827077 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.825874 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/67b17fac-14b3-453d-b0aa-0062a9cf986e-metrics-client-ca\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.827077 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.825916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/67b17fac-14b3-453d-b0aa-0062a9cf986e-root\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.827077 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.825947 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-wtmp\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.827077 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.825977 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5xbz\" (UniqueName: \"kubernetes.io/projected/67b17fac-14b3-453d-b0aa-0062a9cf986e-kube-api-access-r5xbz\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.827077 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.826018 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0226bfa3-7ec0-490e-944a-1e60da426ea1-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-826s9\" (UID: \"0226bfa3-7ec0-490e-944a-1e60da426ea1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"
Apr 20 21:14:52.827077 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.826714 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0226bfa3-7ec0-490e-944a-1e60da426ea1-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-826s9\" (UID: \"0226bfa3-7ec0-490e-944a-1e60da426ea1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"
Apr 20 21:14:52.829042 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.827792 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-accelerators-collector-config\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.829042 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:52.827880 2571 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 20 21:14:52.829042 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:52.827924 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0226bfa3-7ec0-490e-944a-1e60da426ea1-openshift-state-metrics-tls podName:0226bfa3-7ec0-490e-944a-1e60da426ea1 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:53.327910379 +0000 UTC m=+103.570869491 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/0226bfa3-7ec0-490e-944a-1e60da426ea1-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-826s9" (UID: "0226bfa3-7ec0-490e-944a-1e60da426ea1") : secret "openshift-state-metrics-tls" not found
Apr 20 21:14:52.829042 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.828337 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/67b17fac-14b3-453d-b0aa-0062a9cf986e-metrics-client-ca\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.829042 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.828396 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/67b17fac-14b3-453d-b0aa-0062a9cf986e-root\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.829042 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.828508 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-wtmp\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.829042 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.828990 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:52.834123 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.834085 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwjgf\" (UniqueName: \"kubernetes.io/projected/0226bfa3-7ec0-490e-944a-1e60da426ea1-kube-api-access-pwjgf\") pod \"openshift-state-metrics-9d44df66c-826s9\" (UID: \"0226bfa3-7ec0-490e-944a-1e60da426ea1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"
Apr 20 21:14:52.841313 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.841270 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0226bfa3-7ec0-490e-944a-1e60da426ea1-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-826s9\" (UID: \"0226bfa3-7ec0-490e-944a-1e60da426ea1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"
Apr 20 21:14:52.848703 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:52.848663 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5xbz\" (UniqueName: \"kubernetes.io/projected/67b17fac-14b3-453d-b0aa-0062a9cf986e-kube-api-access-r5xbz\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:53.330611 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:53.330044 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-tls\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:53.330611 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:53.330141 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0226bfa3-7ec0-490e-944a-1e60da426ea1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-826s9\" (UID: \"0226bfa3-7ec0-490e-944a-1e60da426ea1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"
Apr 20 21:14:53.336857 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:53.336826 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/67b17fac-14b3-453d-b0aa-0062a9cf986e-node-exporter-tls\") pod \"node-exporter-nmpg7\" (UID: \"67b17fac-14b3-453d-b0aa-0062a9cf986e\") " pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:53.340218 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:53.340169 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0226bfa3-7ec0-490e-944a-1e60da426ea1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-826s9\" (UID: \"0226bfa3-7ec0-490e-944a-1e60da426ea1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"
Apr 20 21:14:53.533477 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:53.533442 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"
Apr 20 21:14:53.558702 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:53.558322 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nmpg7"
Apr 20 21:14:54.205087 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:54.205047 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67b17fac_14b3_453d_b0aa_0062a9cf986e.slice/crio-51169ec5625032239766fd312c73c74d615c8acab0c29fd5f6c1a0c49d3cfac0 WatchSource:0}: Error finding container 51169ec5625032239766fd312c73c74d615c8acab0c29fd5f6c1a0c49d3cfac0: Status 404 returned error can't find the container with id 51169ec5625032239766fd312c73c74d615c8acab0c29fd5f6c1a0c49d3cfac0
Apr 20 21:14:54.344712 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:54.344663 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-826s9"]
Apr 20 21:14:54.348572 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:54.348542 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0226bfa3_7ec0_490e_944a_1e60da426ea1.slice/crio-fe57d4b07f8acb5732c206686180a271b98eaa803146ec0dff94cd249dc8c62e WatchSource:0}: Error finding container fe57d4b07f8acb5732c206686180a271b98eaa803146ec0dff94cd249dc8c62e: Status 404 returned error can't find the container with id fe57d4b07f8acb5732c206686180a271b98eaa803146ec0dff94cd249dc8c62e
Apr 20 21:14:54.542827 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:54.542763 2571 patch_prober.go:28] interesting pod/image-registry-56b9995c8d-m5vt9 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 20 21:14:54.543014 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:54.542825 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9" podUID="adb3353d-8a6b-4b5d-9dbe-795907ebf77a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 21:14:54.822448 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:54.822408 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cqwmd" event={"ID":"096e06e8-bf34-462d-9f43-fd87848fd09e","Type":"ContainerStarted","Data":"fa072dbd66759ae115c87ff22009fb5a378c6f747f94c86a37a00c9eee975f7e"}
Apr 20 21:14:54.822448 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:54.822454 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cqwmd" event={"ID":"096e06e8-bf34-462d-9f43-fd87848fd09e","Type":"ContainerStarted","Data":"3475310719ba9b0fae23e11a70565caca3da6a803b326ec97cea781a16ffe6a5"}
Apr 20 21:14:54.822660 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:54.822538 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cqwmd"
Apr 20 21:14:54.824839 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:54.824809 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9" event={"ID":"0226bfa3-7ec0-490e-944a-1e60da426ea1","Type":"ContainerStarted","Data":"7cd9f5b34d7776bf7935f3d147f8539e118490594a7891e1797d5c5885085182"}
Apr 20 21:14:54.824978 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:54.824853 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9" event={"ID":"0226bfa3-7ec0-490e-944a-1e60da426ea1","Type":"ContainerStarted","Data":"e4e69a3f36d81d9f41a62826e0ad55d480526f10e5cab82b19d6be3725b9ad7d"}
Apr 20 21:14:54.824978 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:54.824868 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9" event={"ID":"0226bfa3-7ec0-490e-944a-1e60da426ea1","Type":"ContainerStarted","Data":"fe57d4b07f8acb5732c206686180a271b98eaa803146ec0dff94cd249dc8c62e"}
Apr 20 21:14:54.826438 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:54.826412 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ld4fl" event={"ID":"08619f25-e76b-45d3-ab4b-8e9490d505f9","Type":"ContainerStarted","Data":"98e9d2bf0f4c06bda2554a34986871e07361a6e35c9d3c43fd7e29853b8a8aeb"}
Apr 20 21:14:54.827606 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:54.827569 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nmpg7" event={"ID":"67b17fac-14b3-453d-b0aa-0062a9cf986e","Type":"ContainerStarted","Data":"51169ec5625032239766fd312c73c74d615c8acab0c29fd5f6c1a0c49d3cfac0"}
Apr 20 21:14:54.839009 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:54.838970 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cqwmd" podStartSLOduration=68.875677132 podStartE2EDuration="1m11.838958852s" podCreationTimestamp="2026-04-20 21:13:43 +0000 UTC" firstStartedPulling="2026-04-20 21:14:51.241846998 +0000 UTC m=+101.484806096" lastFinishedPulling="2026-04-20 21:14:54.205128709 +0000 UTC m=+104.448087816" observedRunningTime="2026-04-20 21:14:54.837968372 +0000 UTC m=+105.080927527" watchObservedRunningTime="2026-04-20 21:14:54.838958852 +0000 UTC m=+105.081917971"
Apr 20 21:14:54.851969 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:54.851928 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ld4fl" podStartSLOduration=68.874202041 podStartE2EDuration="1m11.851915908s" podCreationTimestamp="2026-04-20 21:13:43 +0000 UTC" firstStartedPulling="2026-04-20 21:14:51.230342864 +0000 UTC m=+101.473301980" lastFinishedPulling="2026-04-20 21:14:54.208056745 +0000 UTC m=+104.451015847" observedRunningTime="2026-04-20 21:14:54.850811615 +0000 UTC m=+105.093770736" watchObservedRunningTime="2026-04-20 21:14:54.851915908 +0000 UTC m=+105.094875022"
Apr 20 21:14:55.831904 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:55.831861 2571 generic.go:358] "Generic (PLEG): container finished" podID="67b17fac-14b3-453d-b0aa-0062a9cf986e" containerID="8649262b4b534b7ca9be7d67a02496a5ad047dae860eea1529ababc943728e9a" exitCode=0
Apr 20 21:14:55.832295 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:55.831949 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nmpg7" event={"ID":"67b17fac-14b3-453d-b0aa-0062a9cf986e","Type":"ContainerDied","Data":"8649262b4b534b7ca9be7d67a02496a5ad047dae860eea1529ababc943728e9a"}
Apr 20 21:14:56.837909 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:56.837812 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9" event={"ID":"0226bfa3-7ec0-490e-944a-1e60da426ea1","Type":"ContainerStarted","Data":"271bba84e6001de329efd6bde19d747157519d3e05e31cbfb3b8db5770563b07"}
Apr 20 21:14:56.839879 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:56.839847 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nmpg7" event={"ID":"67b17fac-14b3-453d-b0aa-0062a9cf986e","Type":"ContainerStarted","Data":"859efe71010fc58f5e9c1e73b678cf8ba2fed170a18f2f9ce2902cdce5e8c3be"}
Apr 20 21:14:56.840003 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:56.839884 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nmpg7" event={"ID":"67b17fac-14b3-453d-b0aa-0062a9cf986e","Type":"ContainerStarted","Data":"b689bdea35224a766e114f4bcecbe5490bd2b900f39e32ad471865df538a1b93"}
Apr 20 21:14:56.857113 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:56.857065 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-826s9" podStartSLOduration=2.947820832 podStartE2EDuration="4.85704649s" podCreationTimestamp="2026-04-20 21:14:52 +0000 UTC" firstStartedPulling="2026-04-20 21:14:54.584159385 +0000 UTC m=+104.827118489" lastFinishedPulling="2026-04-20 21:14:56.493385045 +0000 UTC m=+106.736344147" observedRunningTime="2026-04-20 21:14:56.85516635 +0000 UTC m=+107.098125471" watchObservedRunningTime="2026-04-20 21:14:56.85704649 +0000 UTC m=+107.100005614"
Apr 20 21:14:56.873201 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:56.873127 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nmpg7" podStartSLOduration=3.633585663 podStartE2EDuration="4.87311458s" podCreationTimestamp="2026-04-20 21:14:52 +0000 UTC" firstStartedPulling="2026-04-20 21:14:54.206808676 +0000 UTC m=+104.449767860" lastFinishedPulling="2026-04-20 21:14:55.446337667 +0000 UTC m=+105.689296777" observedRunningTime="2026-04-20 21:14:56.872010753 +0000 UTC m=+107.114969875" watchObservedRunningTime="2026-04-20 21:14:56.87311458 +0000 UTC m=+107.116073700"
Apr 20 21:14:57.394055 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:57.394017 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jltn6"]
Apr 20 21:14:57.419945 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:57.419907 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jltn6"]
Apr 20 21:14:57.420099 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:57.419970 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jltn6"
Apr 20 21:14:57.421906 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:57.421884 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-lqd6b\""
Apr 20 21:14:57.422116 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:57.422103 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 20 21:14:57.465409 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:57.465376 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/01779d30-26d6-410c-b5b1-a6f02ae25857-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jltn6\" (UID: \"01779d30-26d6-410c-b5b1-a6f02ae25857\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jltn6"
Apr 20 21:14:57.566384 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:57.566348 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/01779d30-26d6-410c-b5b1-a6f02ae25857-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jltn6\" (UID: \"01779d30-26d6-410c-b5b1-a6f02ae25857\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jltn6"
Apr 20 21:14:57.566573 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:57.566510 2571 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 20 21:14:57.566646 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:14:57.566589 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01779d30-26d6-410c-b5b1-a6f02ae25857-monitoring-plugin-cert podName:01779d30-26d6-410c-b5b1-a6f02ae25857 nodeName:}" failed. No retries permitted until 2026-04-20 21:14:58.066573828 +0000 UTC m=+108.309532929 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/01779d30-26d6-410c-b5b1-a6f02ae25857-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-jltn6" (UID: "01779d30-26d6-410c-b5b1-a6f02ae25857") : secret "monitoring-plugin-cert" not found
Apr 20 21:14:57.759529 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:57.759503 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-56b9995c8d-m5vt9"
Apr 20 21:14:58.070445 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:58.070355 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/01779d30-26d6-410c-b5b1-a6f02ae25857-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jltn6\" (UID: \"01779d30-26d6-410c-b5b1-a6f02ae25857\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jltn6"
Apr 20 21:14:58.073329 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:58.073304 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/01779d30-26d6-410c-b5b1-a6f02ae25857-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jltn6\" (UID: \"01779d30-26d6-410c-b5b1-a6f02ae25857\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jltn6"
Apr 20 21:14:58.331359 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:58.331293 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jltn6"
Apr 20 21:14:58.471553 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:58.471523 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jltn6"]
Apr 20 21:14:58.475336 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:14:58.475307 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01779d30_26d6_410c_b5b1_a6f02ae25857.slice/crio-c037cbb3d14d60c2e54a690246a3ee12967dfc8778c2f4ff0c09f6c8eccb9a17 WatchSource:0}: Error finding container c037cbb3d14d60c2e54a690246a3ee12967dfc8778c2f4ff0c09f6c8eccb9a17: Status 404 returned error can't find the container with id c037cbb3d14d60c2e54a690246a3ee12967dfc8778c2f4ff0c09f6c8eccb9a17
Apr 20 21:14:58.848242 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:14:58.848207 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jltn6" event={"ID":"01779d30-26d6-410c-b5b1-a6f02ae25857","Type":"ContainerStarted","Data":"c037cbb3d14d60c2e54a690246a3ee12967dfc8778c2f4ff0c09f6c8eccb9a17"}
Apr 20 21:15:00.856401 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:00.856362 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jltn6" event={"ID":"01779d30-26d6-410c-b5b1-a6f02ae25857","Type":"ContainerStarted","Data":"7b484641021b01a053469c81c130f50e432ee6e4d8a2d15291c7c10f92142f76"}
Apr 20 21:15:00.856823 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:00.856571 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jltn6"
Apr 20 21:15:00.861728 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:00.861706 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jltn6"
Apr 20 21:15:00.870899 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:00.870855 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jltn6" podStartSLOduration=1.8807995979999999 podStartE2EDuration="3.870839336s" podCreationTimestamp="2026-04-20 21:14:57 +0000 UTC" firstStartedPulling="2026-04-20 21:14:58.477566301 +0000 UTC m=+108.720525399" lastFinishedPulling="2026-04-20 21:15:00.467606037 +0000 UTC m=+110.710565137" observedRunningTime="2026-04-20 21:15:00.870040882 +0000 UTC m=+111.113000014" watchObservedRunningTime="2026-04-20 21:15:00.870839336 +0000 UTC m=+111.113798457"
Apr 20 21:15:04.834112 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:04.834082 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cqwmd"
Apr 20 21:15:20.039590 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:20.039553 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs\") pod \"network-metrics-daemon-fk9cw\" (UID: \"89e3c54c-a866-4c9b-940d-54a417b5c964\") " pod="openshift-multus/network-metrics-daemon-fk9cw"
Apr 20 21:15:20.041892 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:20.041873 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89e3c54c-a866-4c9b-940d-54a417b5c964-metrics-certs\") pod \"network-metrics-daemon-fk9cw\" (UID: \"89e3c54c-a866-4c9b-940d-54a417b5c964\") " pod="openshift-multus/network-metrics-daemon-fk9cw"
Apr 20 21:15:20.258635 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:20.258604 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lxd5q\""
Apr 20 21:15:20.267237 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:20.267217 2571 util.go:30] "No sandbox for
pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fk9cw" Apr 20 21:15:20.387359 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:20.387325 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fk9cw"] Apr 20 21:15:20.390271 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:15:20.390238 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89e3c54c_a866_4c9b_940d_54a417b5c964.slice/crio-a5b898ed2f77806c9f7cf415bf50406f92874e3c71289ca75f744fd516401b77 WatchSource:0}: Error finding container a5b898ed2f77806c9f7cf415bf50406f92874e3c71289ca75f744fd516401b77: Status 404 returned error can't find the container with id a5b898ed2f77806c9f7cf415bf50406f92874e3c71289ca75f744fd516401b77 Apr 20 21:15:20.910789 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:20.910751 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fk9cw" event={"ID":"89e3c54c-a866-4c9b-940d-54a417b5c964","Type":"ContainerStarted","Data":"a5b898ed2f77806c9f7cf415bf50406f92874e3c71289ca75f744fd516401b77"} Apr 20 21:15:22.917842 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:22.917805 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fk9cw" event={"ID":"89e3c54c-a866-4c9b-940d-54a417b5c964","Type":"ContainerStarted","Data":"b0d56ffd8ebf3cdbaf40f4e92e1bb13d92a345f65cccfae3ee265cda8a49fdb0"} Apr 20 21:15:22.917842 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:22.917845 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fk9cw" event={"ID":"89e3c54c-a866-4c9b-940d-54a417b5c964","Type":"ContainerStarted","Data":"3c485c32f4b60cdb4770367e979025f1ad84b0e3dee4001e1a5ad86f36b6ed25"} Apr 20 21:15:22.933268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:22.933220 2571 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-multus/network-metrics-daemon-fk9cw" podStartSLOduration=131.274212252 podStartE2EDuration="2m12.933205205s" podCreationTimestamp="2026-04-20 21:13:10 +0000 UTC" firstStartedPulling="2026-04-20 21:15:20.392165191 +0000 UTC m=+130.635124303" lastFinishedPulling="2026-04-20 21:15:22.051158154 +0000 UTC m=+132.294117256" observedRunningTime="2026-04-20 21:15:22.931761621 +0000 UTC m=+133.174720743" watchObservedRunningTime="2026-04-20 21:15:22.933205205 +0000 UTC m=+133.176164322" Apr 20 21:15:32.945713 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:32.945670 2571 generic.go:358] "Generic (PLEG): container finished" podID="36c761e2-ffbf-4d69-8d19-9b3793a3acf9" containerID="a172960b27c2f9176eb327dda1829bab5da7c3aeba4c3aec96a1504dc73506bf" exitCode=0 Apr 20 21:15:32.946132 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:32.945742 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz" event={"ID":"36c761e2-ffbf-4d69-8d19-9b3793a3acf9","Type":"ContainerDied","Data":"a172960b27c2f9176eb327dda1829bab5da7c3aeba4c3aec96a1504dc73506bf"} Apr 20 21:15:32.946132 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:32.946041 2571 scope.go:117] "RemoveContainer" containerID="a172960b27c2f9176eb327dda1829bab5da7c3aeba4c3aec96a1504dc73506bf" Apr 20 21:15:33.950839 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:33.950807 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-8gwlz" event={"ID":"36c761e2-ffbf-4d69-8d19-9b3793a3acf9","Type":"ContainerStarted","Data":"207f484512ee1e78363b3189436b66ea59c60224382086e56b9879e24d02e212"} Apr 20 21:15:42.977196 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:42.977146 2571 generic.go:358] "Generic (PLEG): container finished" podID="f6a03f80-2426-4087-b868-a71402310e22" 
containerID="dbeb3a3f530a63aa7e5769487daa90bac2702c580990af8c88c02164bcc1651d" exitCode=0 Apr 20 21:15:42.977578 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:42.977223 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xxkhk" event={"ID":"f6a03f80-2426-4087-b868-a71402310e22","Type":"ContainerDied","Data":"dbeb3a3f530a63aa7e5769487daa90bac2702c580990af8c88c02164bcc1651d"} Apr 20 21:15:42.977578 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:42.977499 2571 scope.go:117] "RemoveContainer" containerID="dbeb3a3f530a63aa7e5769487daa90bac2702c580990af8c88c02164bcc1651d" Apr 20 21:15:43.981869 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:15:43.981832 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xxkhk" event={"ID":"f6a03f80-2426-4087-b868-a71402310e22","Type":"ContainerStarted","Data":"b53aaec1fcf8140dc328d7ee1c9f085500e5f533871aab4b60e9e34262aa30a4"} Apr 20 21:16:50.329515 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:50.329478 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-h84cc"] Apr 20 21:16:50.332962 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:50.332933 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h84cc" Apr 20 21:16:50.334894 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:50.334876 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 21:16:50.340998 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:50.340978 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h84cc"] Apr 20 21:16:50.397308 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:50.397282 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f42114d2-f0ca-4f06-bdf6-49ec62ba06a3-original-pull-secret\") pod \"global-pull-secret-syncer-h84cc\" (UID: \"f42114d2-f0ca-4f06-bdf6-49ec62ba06a3\") " pod="kube-system/global-pull-secret-syncer-h84cc" Apr 20 21:16:50.397428 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:50.397411 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f42114d2-f0ca-4f06-bdf6-49ec62ba06a3-dbus\") pod \"global-pull-secret-syncer-h84cc\" (UID: \"f42114d2-f0ca-4f06-bdf6-49ec62ba06a3\") " pod="kube-system/global-pull-secret-syncer-h84cc" Apr 20 21:16:50.397504 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:50.397486 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f42114d2-f0ca-4f06-bdf6-49ec62ba06a3-kubelet-config\") pod \"global-pull-secret-syncer-h84cc\" (UID: \"f42114d2-f0ca-4f06-bdf6-49ec62ba06a3\") " pod="kube-system/global-pull-secret-syncer-h84cc" Apr 20 21:16:50.498673 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:50.498634 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/f42114d2-f0ca-4f06-bdf6-49ec62ba06a3-kubelet-config\") pod \"global-pull-secret-syncer-h84cc\" (UID: \"f42114d2-f0ca-4f06-bdf6-49ec62ba06a3\") " pod="kube-system/global-pull-secret-syncer-h84cc" Apr 20 21:16:50.498782 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:50.498707 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f42114d2-f0ca-4f06-bdf6-49ec62ba06a3-original-pull-secret\") pod \"global-pull-secret-syncer-h84cc\" (UID: \"f42114d2-f0ca-4f06-bdf6-49ec62ba06a3\") " pod="kube-system/global-pull-secret-syncer-h84cc" Apr 20 21:16:50.498782 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:50.498728 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f42114d2-f0ca-4f06-bdf6-49ec62ba06a3-dbus\") pod \"global-pull-secret-syncer-h84cc\" (UID: \"f42114d2-f0ca-4f06-bdf6-49ec62ba06a3\") " pod="kube-system/global-pull-secret-syncer-h84cc" Apr 20 21:16:50.498782 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:50.498754 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f42114d2-f0ca-4f06-bdf6-49ec62ba06a3-kubelet-config\") pod \"global-pull-secret-syncer-h84cc\" (UID: \"f42114d2-f0ca-4f06-bdf6-49ec62ba06a3\") " pod="kube-system/global-pull-secret-syncer-h84cc" Apr 20 21:16:50.498885 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:50.498844 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f42114d2-f0ca-4f06-bdf6-49ec62ba06a3-dbus\") pod \"global-pull-secret-syncer-h84cc\" (UID: \"f42114d2-f0ca-4f06-bdf6-49ec62ba06a3\") " pod="kube-system/global-pull-secret-syncer-h84cc" Apr 20 21:16:50.500996 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:50.500972 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f42114d2-f0ca-4f06-bdf6-49ec62ba06a3-original-pull-secret\") pod \"global-pull-secret-syncer-h84cc\" (UID: \"f42114d2-f0ca-4f06-bdf6-49ec62ba06a3\") " pod="kube-system/global-pull-secret-syncer-h84cc" Apr 20 21:16:50.642466 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:50.642397 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h84cc" Apr 20 21:16:50.760037 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:50.760006 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h84cc"] Apr 20 21:16:50.763050 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:16:50.763018 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf42114d2_f0ca_4f06_bdf6_49ec62ba06a3.slice/crio-2ee147e5d08b0da0ef660dc8ade93821816903a44aa26cdd434b2b1001ccbc72 WatchSource:0}: Error finding container 2ee147e5d08b0da0ef660dc8ade93821816903a44aa26cdd434b2b1001ccbc72: Status 404 returned error can't find the container with id 2ee147e5d08b0da0ef660dc8ade93821816903a44aa26cdd434b2b1001ccbc72 Apr 20 21:16:51.165733 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:51.165690 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h84cc" event={"ID":"f42114d2-f0ca-4f06-bdf6-49ec62ba06a3","Type":"ContainerStarted","Data":"2ee147e5d08b0da0ef660dc8ade93821816903a44aa26cdd434b2b1001ccbc72"} Apr 20 21:16:56.182842 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:56.182796 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h84cc" event={"ID":"f42114d2-f0ca-4f06-bdf6-49ec62ba06a3","Type":"ContainerStarted","Data":"82b13d3e3757e41d6a27e77f57608cbbda61847617ad27d93dc14216551cb19c"} Apr 20 21:16:56.197993 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:16:56.197944 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-h84cc" podStartSLOduration=1.441196506 podStartE2EDuration="6.197929479s" podCreationTimestamp="2026-04-20 21:16:50 +0000 UTC" firstStartedPulling="2026-04-20 21:16:50.764946269 +0000 UTC m=+221.007905370" lastFinishedPulling="2026-04-20 21:16:55.521679242 +0000 UTC m=+225.764638343" observedRunningTime="2026-04-20 21:16:56.196999503 +0000 UTC m=+226.439958623" watchObservedRunningTime="2026-04-20 21:16:56.197929479 +0000 UTC m=+226.440888599" Apr 20 21:17:30.183130 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:30.183043 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-88vsn"] Apr 20 21:17:30.186385 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:30.186363 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-88vsn" Apr 20 21:17:30.188600 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:30.188583 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-zkqfw\"" Apr 20 21:17:30.188679 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:30.188598 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 20 21:17:30.188872 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:30.188858 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 20 21:17:30.197458 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:30.197435 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-88vsn"] Apr 20 21:17:30.307258 ip-10-0-129-149 kubenswrapper[2571]: I0420 
21:17:30.307223 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/244ef8ba-c834-4e23-969b-60c9badb62d3-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-88vsn\" (UID: \"244ef8ba-c834-4e23-969b-60c9badb62d3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-88vsn" Apr 20 21:17:30.307396 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:30.307302 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktb4p\" (UniqueName: \"kubernetes.io/projected/244ef8ba-c834-4e23-969b-60c9badb62d3-kube-api-access-ktb4p\") pod \"cert-manager-operator-controller-manager-54b9655956-88vsn\" (UID: \"244ef8ba-c834-4e23-969b-60c9badb62d3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-88vsn" Apr 20 21:17:30.407948 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:30.407924 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/244ef8ba-c834-4e23-969b-60c9badb62d3-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-88vsn\" (UID: \"244ef8ba-c834-4e23-969b-60c9badb62d3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-88vsn" Apr 20 21:17:30.408082 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:30.407974 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktb4p\" (UniqueName: \"kubernetes.io/projected/244ef8ba-c834-4e23-969b-60c9badb62d3-kube-api-access-ktb4p\") pod \"cert-manager-operator-controller-manager-54b9655956-88vsn\" (UID: \"244ef8ba-c834-4e23-969b-60c9badb62d3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-88vsn" Apr 20 21:17:30.408312 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:30.408295 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/244ef8ba-c834-4e23-969b-60c9badb62d3-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-88vsn\" (UID: \"244ef8ba-c834-4e23-969b-60c9badb62d3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-88vsn" Apr 20 21:17:30.416540 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:30.416518 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktb4p\" (UniqueName: \"kubernetes.io/projected/244ef8ba-c834-4e23-969b-60c9badb62d3-kube-api-access-ktb4p\") pod \"cert-manager-operator-controller-manager-54b9655956-88vsn\" (UID: \"244ef8ba-c834-4e23-969b-60c9badb62d3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-88vsn" Apr 20 21:17:30.495842 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:30.495822 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-88vsn" Apr 20 21:17:30.620164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:30.620131 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-88vsn"] Apr 20 21:17:30.623727 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:17:30.623693 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod244ef8ba_c834_4e23_969b_60c9badb62d3.slice/crio-7df5e6cf80720f97a070958329dd790b548ae196d6f6f620c2ebc664ba3d04b2 WatchSource:0}: Error finding container 7df5e6cf80720f97a070958329dd790b548ae196d6f6f620c2ebc664ba3d04b2: Status 404 returned error can't find the container with id 7df5e6cf80720f97a070958329dd790b548ae196d6f6f620c2ebc664ba3d04b2 Apr 20 21:17:31.279639 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:31.279606 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-88vsn" event={"ID":"244ef8ba-c834-4e23-969b-60c9badb62d3","Type":"ContainerStarted","Data":"7df5e6cf80720f97a070958329dd790b548ae196d6f6f620c2ebc664ba3d04b2"} Apr 20 21:17:33.287005 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:33.286975 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-88vsn" event={"ID":"244ef8ba-c834-4e23-969b-60c9badb62d3","Type":"ContainerStarted","Data":"0d98332cae505edf244b9cfa9629b6a014b56b680c9d0cdcf4d0778e3de26fdc"} Apr 20 21:17:33.309732 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:33.309682 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-88vsn" podStartSLOduration=0.829242607 podStartE2EDuration="3.309663692s" podCreationTimestamp="2026-04-20 21:17:30 +0000 UTC" firstStartedPulling="2026-04-20 21:17:30.626474682 +0000 UTC m=+260.869433784" lastFinishedPulling="2026-04-20 21:17:33.10689577 +0000 UTC m=+263.349854869" observedRunningTime="2026-04-20 21:17:33.308311785 +0000 UTC m=+263.551270904" watchObservedRunningTime="2026-04-20 21:17:33.309663692 +0000 UTC m=+263.552622813" Apr 20 21:17:33.983691 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:33.983648 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l"] Apr 20 21:17:33.988905 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:33.988880 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" Apr 20 21:17:33.991091 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:33.991064 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 21:17:33.991376 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:33.991354 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wh4tl\"" Apr 20 21:17:33.991593 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:33.991568 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 21:17:33.995249 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:33.995228 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l"] Apr 20 21:17:34.138145 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:34.138107 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d64ee09-e7df-4db5-a641-e2e6d717aef0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l\" (UID: \"2d64ee09-e7df-4db5-a641-e2e6d717aef0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" Apr 20 21:17:34.138359 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:34.138206 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d64ee09-e7df-4db5-a641-e2e6d717aef0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l\" (UID: \"2d64ee09-e7df-4db5-a641-e2e6d717aef0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" Apr 20 21:17:34.138359 
ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:34.138296 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r658v\" (UniqueName: \"kubernetes.io/projected/2d64ee09-e7df-4db5-a641-e2e6d717aef0-kube-api-access-r658v\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l\" (UID: \"2d64ee09-e7df-4db5-a641-e2e6d717aef0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" Apr 20 21:17:34.239692 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:34.239549 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d64ee09-e7df-4db5-a641-e2e6d717aef0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l\" (UID: \"2d64ee09-e7df-4db5-a641-e2e6d717aef0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" Apr 20 21:17:34.239843 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:34.239700 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d64ee09-e7df-4db5-a641-e2e6d717aef0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l\" (UID: \"2d64ee09-e7df-4db5-a641-e2e6d717aef0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" Apr 20 21:17:34.239843 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:34.239754 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r658v\" (UniqueName: \"kubernetes.io/projected/2d64ee09-e7df-4db5-a641-e2e6d717aef0-kube-api-access-r658v\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l\" (UID: \"2d64ee09-e7df-4db5-a641-e2e6d717aef0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" Apr 20 21:17:34.239984 ip-10-0-129-149 
kubenswrapper[2571]: I0420 21:17:34.239929 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d64ee09-e7df-4db5-a641-e2e6d717aef0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l\" (UID: \"2d64ee09-e7df-4db5-a641-e2e6d717aef0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" Apr 20 21:17:34.240077 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:34.240054 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d64ee09-e7df-4db5-a641-e2e6d717aef0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l\" (UID: \"2d64ee09-e7df-4db5-a641-e2e6d717aef0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" Apr 20 21:17:34.248394 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:34.248372 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r658v\" (UniqueName: \"kubernetes.io/projected/2d64ee09-e7df-4db5-a641-e2e6d717aef0-kube-api-access-r658v\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l\" (UID: \"2d64ee09-e7df-4db5-a641-e2e6d717aef0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" Apr 20 21:17:34.298973 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:34.298948 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" Apr 20 21:17:34.423938 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:34.423906 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l"] Apr 20 21:17:34.426700 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:17:34.426665 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d64ee09_e7df_4db5_a641_e2e6d717aef0.slice/crio-84bb8d25282d1ef0c08cf89a16a387458375eb46f7d599fa9c76d4693ee965a3 WatchSource:0}: Error finding container 84bb8d25282d1ef0c08cf89a16a387458375eb46f7d599fa9c76d4693ee965a3: Status 404 returned error can't find the container with id 84bb8d25282d1ef0c08cf89a16a387458375eb46f7d599fa9c76d4693ee965a3 Apr 20 21:17:35.294812 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:35.294763 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" event={"ID":"2d64ee09-e7df-4db5-a641-e2e6d717aef0","Type":"ContainerStarted","Data":"84bb8d25282d1ef0c08cf89a16a387458375eb46f7d599fa9c76d4693ee965a3"} Apr 20 21:17:36.476941 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:36.476902 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-2cl66"] Apr 20 21:17:36.480313 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:36.480290 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-2cl66" Apr 20 21:17:36.482666 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:36.482648 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 21:17:36.483257 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:36.483212 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-545pm\"" Apr 20 21:17:36.483384 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:36.483264 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 21:17:36.487242 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:36.487223 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-2cl66"] Apr 20 21:17:36.661417 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:36.661372 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlxst\" (UniqueName: \"kubernetes.io/projected/3e1a9dc3-2077-49cd-8368-fa9a72c3452c-kube-api-access-jlxst\") pod \"cert-manager-webhook-587ccfb98-2cl66\" (UID: \"3e1a9dc3-2077-49cd-8368-fa9a72c3452c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-2cl66" Apr 20 21:17:36.661588 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:36.661430 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e1a9dc3-2077-49cd-8368-fa9a72c3452c-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-2cl66\" (UID: \"3e1a9dc3-2077-49cd-8368-fa9a72c3452c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-2cl66" Apr 20 21:17:36.761877 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:36.761840 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/3e1a9dc3-2077-49cd-8368-fa9a72c3452c-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-2cl66\" (UID: \"3e1a9dc3-2077-49cd-8368-fa9a72c3452c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-2cl66" Apr 20 21:17:36.762057 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:36.761970 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlxst\" (UniqueName: \"kubernetes.io/projected/3e1a9dc3-2077-49cd-8368-fa9a72c3452c-kube-api-access-jlxst\") pod \"cert-manager-webhook-587ccfb98-2cl66\" (UID: \"3e1a9dc3-2077-49cd-8368-fa9a72c3452c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-2cl66" Apr 20 21:17:36.770009 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:36.769982 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e1a9dc3-2077-49cd-8368-fa9a72c3452c-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-2cl66\" (UID: \"3e1a9dc3-2077-49cd-8368-fa9a72c3452c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-2cl66" Apr 20 21:17:36.770225 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:36.770203 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlxst\" (UniqueName: \"kubernetes.io/projected/3e1a9dc3-2077-49cd-8368-fa9a72c3452c-kube-api-access-jlxst\") pod \"cert-manager-webhook-587ccfb98-2cl66\" (UID: \"3e1a9dc3-2077-49cd-8368-fa9a72c3452c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-2cl66" Apr 20 21:17:36.802100 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:36.802068 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-2cl66" Apr 20 21:17:36.937828 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:36.937794 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-2cl66"] Apr 20 21:17:36.940984 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:17:36.940952 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e1a9dc3_2077_49cd_8368_fa9a72c3452c.slice/crio-cc59d5be00cfdc3470f9f0dc92d5daaecd5003a027a345cc6561406d54afc90c WatchSource:0}: Error finding container cc59d5be00cfdc3470f9f0dc92d5daaecd5003a027a345cc6561406d54afc90c: Status 404 returned error can't find the container with id cc59d5be00cfdc3470f9f0dc92d5daaecd5003a027a345cc6561406d54afc90c Apr 20 21:17:37.304118 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:37.304081 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-2cl66" event={"ID":"3e1a9dc3-2077-49cd-8368-fa9a72c3452c","Type":"ContainerStarted","Data":"cc59d5be00cfdc3470f9f0dc92d5daaecd5003a027a345cc6561406d54afc90c"} Apr 20 21:17:40.315412 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:40.315369 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-2cl66" event={"ID":"3e1a9dc3-2077-49cd-8368-fa9a72c3452c","Type":"ContainerStarted","Data":"7cd806ee05ac96d0cbc4e024a88ec609130295d560fa9e5e7b32860a88dedad3"} Apr 20 21:17:40.315884 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:40.315502 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-2cl66" Apr 20 21:17:40.332766 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:40.332716 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-2cl66" podStartSLOduration=1.76831118 
podStartE2EDuration="4.332703127s" podCreationTimestamp="2026-04-20 21:17:36 +0000 UTC" firstStartedPulling="2026-04-20 21:17:36.942969887 +0000 UTC m=+267.185928986" lastFinishedPulling="2026-04-20 21:17:39.507361832 +0000 UTC m=+269.750320933" observedRunningTime="2026-04-20 21:17:40.331599617 +0000 UTC m=+270.574558735" watchObservedRunningTime="2026-04-20 21:17:40.332703127 +0000 UTC m=+270.575662246" Apr 20 21:17:42.323673 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:42.323637 2571 generic.go:358] "Generic (PLEG): container finished" podID="2d64ee09-e7df-4db5-a641-e2e6d717aef0" containerID="b77d529e9763f24f0436f16e73818e73e9a39f95177bed03b61e75150a4651ff" exitCode=0 Apr 20 21:17:42.324061 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:42.323724 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" event={"ID":"2d64ee09-e7df-4db5-a641-e2e6d717aef0","Type":"ContainerDied","Data":"b77d529e9763f24f0436f16e73818e73e9a39f95177bed03b61e75150a4651ff"} Apr 20 21:17:44.333380 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:44.333333 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" event={"ID":"2d64ee09-e7df-4db5-a641-e2e6d717aef0","Type":"ContainerStarted","Data":"b2fdcbcc9c119cf2ee63c4c8d0cb52879cb57019120684e45f9ea5e247cfe3ff"} Apr 20 21:17:45.337772 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:45.337738 2571 generic.go:358] "Generic (PLEG): container finished" podID="2d64ee09-e7df-4db5-a641-e2e6d717aef0" containerID="b2fdcbcc9c119cf2ee63c4c8d0cb52879cb57019120684e45f9ea5e247cfe3ff" exitCode=0 Apr 20 21:17:45.338137 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:45.337807 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" 
event={"ID":"2d64ee09-e7df-4db5-a641-e2e6d717aef0","Type":"ContainerDied","Data":"b2fdcbcc9c119cf2ee63c4c8d0cb52879cb57019120684e45f9ea5e247cfe3ff"} Apr 20 21:17:46.321670 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:46.321638 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-2cl66" Apr 20 21:17:52.362320 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:52.362280 2571 generic.go:358] "Generic (PLEG): container finished" podID="2d64ee09-e7df-4db5-a641-e2e6d717aef0" containerID="47b37b64b3482ca331bbea8f3e77ba0e95bf6b69d5e3c55252d650c55de035c1" exitCode=0 Apr 20 21:17:52.362663 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:52.362360 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" event={"ID":"2d64ee09-e7df-4db5-a641-e2e6d717aef0","Type":"ContainerDied","Data":"47b37b64b3482ca331bbea8f3e77ba0e95bf6b69d5e3c55252d650c55de035c1"} Apr 20 21:17:53.485217 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:53.485172 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" Apr 20 21:17:53.602949 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:53.602918 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d64ee09-e7df-4db5-a641-e2e6d717aef0-util\") pod \"2d64ee09-e7df-4db5-a641-e2e6d717aef0\" (UID: \"2d64ee09-e7df-4db5-a641-e2e6d717aef0\") " Apr 20 21:17:53.603121 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:53.602981 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d64ee09-e7df-4db5-a641-e2e6d717aef0-bundle\") pod \"2d64ee09-e7df-4db5-a641-e2e6d717aef0\" (UID: \"2d64ee09-e7df-4db5-a641-e2e6d717aef0\") " Apr 20 21:17:53.603121 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:53.603014 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r658v\" (UniqueName: \"kubernetes.io/projected/2d64ee09-e7df-4db5-a641-e2e6d717aef0-kube-api-access-r658v\") pod \"2d64ee09-e7df-4db5-a641-e2e6d717aef0\" (UID: \"2d64ee09-e7df-4db5-a641-e2e6d717aef0\") " Apr 20 21:17:53.603419 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:53.603394 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d64ee09-e7df-4db5-a641-e2e6d717aef0-bundle" (OuterVolumeSpecName: "bundle") pod "2d64ee09-e7df-4db5-a641-e2e6d717aef0" (UID: "2d64ee09-e7df-4db5-a641-e2e6d717aef0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:17:53.605305 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:53.605276 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d64ee09-e7df-4db5-a641-e2e6d717aef0-kube-api-access-r658v" (OuterVolumeSpecName: "kube-api-access-r658v") pod "2d64ee09-e7df-4db5-a641-e2e6d717aef0" (UID: "2d64ee09-e7df-4db5-a641-e2e6d717aef0"). InnerVolumeSpecName "kube-api-access-r658v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:17:53.607576 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:53.607546 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d64ee09-e7df-4db5-a641-e2e6d717aef0-util" (OuterVolumeSpecName: "util") pod "2d64ee09-e7df-4db5-a641-e2e6d717aef0" (UID: "2d64ee09-e7df-4db5-a641-e2e6d717aef0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:17:53.703486 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:53.703416 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d64ee09-e7df-4db5-a641-e2e6d717aef0-bundle\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:17:53.703486 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:53.703440 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r658v\" (UniqueName: \"kubernetes.io/projected/2d64ee09-e7df-4db5-a641-e2e6d717aef0-kube-api-access-r658v\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:17:53.703486 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:53.703450 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d64ee09-e7df-4db5-a641-e2e6d717aef0-util\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:17:54.368963 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.368928 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" event={"ID":"2d64ee09-e7df-4db5-a641-e2e6d717aef0","Type":"ContainerDied","Data":"84bb8d25282d1ef0c08cf89a16a387458375eb46f7d599fa9c76d4693ee965a3"} Apr 20 21:17:54.369090 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.368973 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84bb8d25282d1ef0c08cf89a16a387458375eb46f7d599fa9c76d4693ee965a3" Apr 20 21:17:54.369090 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.368946 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frrr5l" Apr 20 21:17:54.427645 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.427617 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-pgvj4"] Apr 20 21:17:54.427893 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.427881 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d64ee09-e7df-4db5-a641-e2e6d717aef0" containerName="pull" Apr 20 21:17:54.427943 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.427894 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d64ee09-e7df-4db5-a641-e2e6d717aef0" containerName="pull" Apr 20 21:17:54.427943 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.427906 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d64ee09-e7df-4db5-a641-e2e6d717aef0" containerName="extract" Apr 20 21:17:54.427943 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.427911 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d64ee09-e7df-4db5-a641-e2e6d717aef0" containerName="extract" Apr 20 21:17:54.427943 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.427928 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="2d64ee09-e7df-4db5-a641-e2e6d717aef0" containerName="util" Apr 20 21:17:54.427943 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.427933 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d64ee09-e7df-4db5-a641-e2e6d717aef0" containerName="util" Apr 20 21:17:54.428103 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.427981 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d64ee09-e7df-4db5-a641-e2e6d717aef0" containerName="extract" Apr 20 21:17:54.430823 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.430807 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-pgvj4" Apr 20 21:17:54.432866 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.432840 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-ll7nh\"" Apr 20 21:17:54.437820 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.437800 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-pgvj4"] Apr 20 21:17:54.506883 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.506853 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a6c7712-2178-40f5-a731-ebb77c77f816-bound-sa-token\") pod \"cert-manager-79c8d999ff-pgvj4\" (UID: \"1a6c7712-2178-40f5-a731-ebb77c77f816\") " pod="cert-manager/cert-manager-79c8d999ff-pgvj4" Apr 20 21:17:54.507228 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.506891 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhfkp\" (UniqueName: \"kubernetes.io/projected/1a6c7712-2178-40f5-a731-ebb77c77f816-kube-api-access-bhfkp\") pod \"cert-manager-79c8d999ff-pgvj4\" (UID: \"1a6c7712-2178-40f5-a731-ebb77c77f816\") " pod="cert-manager/cert-manager-79c8d999ff-pgvj4" Apr 20 21:17:54.607477 
ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.607418 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a6c7712-2178-40f5-a731-ebb77c77f816-bound-sa-token\") pod \"cert-manager-79c8d999ff-pgvj4\" (UID: \"1a6c7712-2178-40f5-a731-ebb77c77f816\") " pod="cert-manager/cert-manager-79c8d999ff-pgvj4" Apr 20 21:17:54.607477 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.607451 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhfkp\" (UniqueName: \"kubernetes.io/projected/1a6c7712-2178-40f5-a731-ebb77c77f816-kube-api-access-bhfkp\") pod \"cert-manager-79c8d999ff-pgvj4\" (UID: \"1a6c7712-2178-40f5-a731-ebb77c77f816\") " pod="cert-manager/cert-manager-79c8d999ff-pgvj4" Apr 20 21:17:54.615677 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.615655 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a6c7712-2178-40f5-a731-ebb77c77f816-bound-sa-token\") pod \"cert-manager-79c8d999ff-pgvj4\" (UID: \"1a6c7712-2178-40f5-a731-ebb77c77f816\") " pod="cert-manager/cert-manager-79c8d999ff-pgvj4" Apr 20 21:17:54.615777 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.615715 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhfkp\" (UniqueName: \"kubernetes.io/projected/1a6c7712-2178-40f5-a731-ebb77c77f816-kube-api-access-bhfkp\") pod \"cert-manager-79c8d999ff-pgvj4\" (UID: \"1a6c7712-2178-40f5-a731-ebb77c77f816\") " pod="cert-manager/cert-manager-79c8d999ff-pgvj4" Apr 20 21:17:54.739460 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.739438 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-pgvj4" Apr 20 21:17:54.856462 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:54.856431 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-pgvj4"] Apr 20 21:17:54.859359 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:17:54.859304 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a6c7712_2178_40f5_a731_ebb77c77f816.slice/crio-dd2a108392844b751f1f9560bc31c078a88abb748a63f9b7dad9e22afab90520 WatchSource:0}: Error finding container dd2a108392844b751f1f9560bc31c078a88abb748a63f9b7dad9e22afab90520: Status 404 returned error can't find the container with id dd2a108392844b751f1f9560bc31c078a88abb748a63f9b7dad9e22afab90520 Apr 20 21:17:55.373659 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:55.373625 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-pgvj4" event={"ID":"1a6c7712-2178-40f5-a731-ebb77c77f816","Type":"ContainerStarted","Data":"75a43e4f908442264981ce848c327fd7b7e3ab6df3cd29eb407c2f80d8f1aa83"} Apr 20 21:17:55.373659 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:55.373657 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-pgvj4" event={"ID":"1a6c7712-2178-40f5-a731-ebb77c77f816","Type":"ContainerStarted","Data":"dd2a108392844b751f1f9560bc31c078a88abb748a63f9b7dad9e22afab90520"} Apr 20 21:17:55.402423 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:17:55.402377 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-pgvj4" podStartSLOduration=1.402363894 podStartE2EDuration="1.402363894s" podCreationTimestamp="2026-04-20 21:17:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:17:55.400246474 +0000 UTC 
m=+285.643205593" watchObservedRunningTime="2026-04-20 21:17:55.402363894 +0000 UTC m=+285.645323013" Apr 20 21:18:10.204626 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:10.204595 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log" Apr 20 21:18:10.205096 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:10.204981 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log" Apr 20 21:18:10.210095 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:10.210075 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log" Apr 20 21:18:10.210282 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:10.210264 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log" Apr 20 21:18:10.214589 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:10.214570 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 21:18:18.102775 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.102737 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n"] Apr 20 21:18:18.106029 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.106010 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" Apr 20 21:18:18.108474 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.108456 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-hgbvg\"" Apr 20 21:18:18.108570 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.108518 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 21:18:18.108632 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.108521 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 21:18:18.108693 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.108674 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 21:18:18.108920 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.108899 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 21:18:18.122576 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.122555 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n"] Apr 20 21:18:18.169093 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.169056 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr45w\" (UniqueName: \"kubernetes.io/projected/9b6b2586-750e-4c97-bd26-6712ca103c6d-kube-api-access-mr45w\") pod \"opendatahub-operator-controller-manager-85fc55dd88-5pr8n\" (UID: \"9b6b2586-750e-4c97-bd26-6712ca103c6d\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" Apr 20 21:18:18.169093 ip-10-0-129-149 kubenswrapper[2571]: I0420 
21:18:18.169098 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b6b2586-750e-4c97-bd26-6712ca103c6d-webhook-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-5pr8n\" (UID: \"9b6b2586-750e-4c97-bd26-6712ca103c6d\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" Apr 20 21:18:18.169328 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.169121 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b6b2586-750e-4c97-bd26-6712ca103c6d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-5pr8n\" (UID: \"9b6b2586-750e-4c97-bd26-6712ca103c6d\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" Apr 20 21:18:18.270555 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.270500 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr45w\" (UniqueName: \"kubernetes.io/projected/9b6b2586-750e-4c97-bd26-6712ca103c6d-kube-api-access-mr45w\") pod \"opendatahub-operator-controller-manager-85fc55dd88-5pr8n\" (UID: \"9b6b2586-750e-4c97-bd26-6712ca103c6d\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" Apr 20 21:18:18.270753 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.270564 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b6b2586-750e-4c97-bd26-6712ca103c6d-webhook-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-5pr8n\" (UID: \"9b6b2586-750e-4c97-bd26-6712ca103c6d\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" Apr 20 21:18:18.270753 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.270606 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b6b2586-750e-4c97-bd26-6712ca103c6d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-5pr8n\" (UID: \"9b6b2586-750e-4c97-bd26-6712ca103c6d\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" Apr 20 21:18:18.273115 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.273086 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b6b2586-750e-4c97-bd26-6712ca103c6d-webhook-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-5pr8n\" (UID: \"9b6b2586-750e-4c97-bd26-6712ca103c6d\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" Apr 20 21:18:18.273245 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.273121 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b6b2586-750e-4c97-bd26-6712ca103c6d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-85fc55dd88-5pr8n\" (UID: \"9b6b2586-750e-4c97-bd26-6712ca103c6d\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" Apr 20 21:18:18.286927 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.286901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr45w\" (UniqueName: \"kubernetes.io/projected/9b6b2586-750e-4c97-bd26-6712ca103c6d-kube-api-access-mr45w\") pod \"opendatahub-operator-controller-manager-85fc55dd88-5pr8n\" (UID: \"9b6b2586-750e-4c97-bd26-6712ca103c6d\") " pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" Apr 20 21:18:18.416569 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.416495 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" Apr 20 21:18:18.555790 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.555750 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n"] Apr 20 21:18:18.559111 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:18:18.559085 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b6b2586_750e_4c97_bd26_6712ca103c6d.slice/crio-56e5a76a9e0cf1a8cedd48d144f189f656ae4c2ea2ca1fca0bdec804868040c3 WatchSource:0}: Error finding container 56e5a76a9e0cf1a8cedd48d144f189f656ae4c2ea2ca1fca0bdec804868040c3: Status 404 returned error can't find the container with id 56e5a76a9e0cf1a8cedd48d144f189f656ae4c2ea2ca1fca0bdec804868040c3 Apr 20 21:18:18.560704 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.560687 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 21:18:18.622150 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.622118 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp"] Apr 20 21:18:18.626520 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.626503 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:18.628672 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.628650 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 21:18:18.628814 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.628797 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 21:18:18.628814 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.628808 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 21:18:18.628935 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.628809 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 21:18:18.628977 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.628963 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 21:18:18.629045 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.629027 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-sgxkp\"" Apr 20 21:18:18.632746 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.632723 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp"] Apr 20 21:18:18.674161 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.674072 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bb6896d-7f01-46d8-9997-83170aa22077-metrics-cert\") pod \"lws-controller-manager-56b87855f9-7pkwp\" (UID: \"9bb6896d-7f01-46d8-9997-83170aa22077\") " 
pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:18.674161 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.674112 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n5rb\" (UniqueName: \"kubernetes.io/projected/9bb6896d-7f01-46d8-9997-83170aa22077-kube-api-access-2n5rb\") pod \"lws-controller-manager-56b87855f9-7pkwp\" (UID: \"9bb6896d-7f01-46d8-9997-83170aa22077\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:18.674380 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.674197 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9bb6896d-7f01-46d8-9997-83170aa22077-manager-config\") pod \"lws-controller-manager-56b87855f9-7pkwp\" (UID: \"9bb6896d-7f01-46d8-9997-83170aa22077\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:18.674380 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.674285 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9bb6896d-7f01-46d8-9997-83170aa22077-cert\") pod \"lws-controller-manager-56b87855f9-7pkwp\" (UID: \"9bb6896d-7f01-46d8-9997-83170aa22077\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:18.775573 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.775535 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9bb6896d-7f01-46d8-9997-83170aa22077-cert\") pod \"lws-controller-manager-56b87855f9-7pkwp\" (UID: \"9bb6896d-7f01-46d8-9997-83170aa22077\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:18.775726 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.775595 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bb6896d-7f01-46d8-9997-83170aa22077-metrics-cert\") pod \"lws-controller-manager-56b87855f9-7pkwp\" (UID: \"9bb6896d-7f01-46d8-9997-83170aa22077\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:18.775726 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.775623 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2n5rb\" (UniqueName: \"kubernetes.io/projected/9bb6896d-7f01-46d8-9997-83170aa22077-kube-api-access-2n5rb\") pod \"lws-controller-manager-56b87855f9-7pkwp\" (UID: \"9bb6896d-7f01-46d8-9997-83170aa22077\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:18.775726 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.775659 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9bb6896d-7f01-46d8-9997-83170aa22077-manager-config\") pod \"lws-controller-manager-56b87855f9-7pkwp\" (UID: \"9bb6896d-7f01-46d8-9997-83170aa22077\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:18.776329 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.776305 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9bb6896d-7f01-46d8-9997-83170aa22077-manager-config\") pod \"lws-controller-manager-56b87855f9-7pkwp\" (UID: \"9bb6896d-7f01-46d8-9997-83170aa22077\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:18.778135 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.778116 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9bb6896d-7f01-46d8-9997-83170aa22077-cert\") pod \"lws-controller-manager-56b87855f9-7pkwp\" (UID: \"9bb6896d-7f01-46d8-9997-83170aa22077\") 
" pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:18.778223 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.778170 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bb6896d-7f01-46d8-9997-83170aa22077-metrics-cert\") pod \"lws-controller-manager-56b87855f9-7pkwp\" (UID: \"9bb6896d-7f01-46d8-9997-83170aa22077\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:18.787649 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.787620 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n5rb\" (UniqueName: \"kubernetes.io/projected/9bb6896d-7f01-46d8-9997-83170aa22077-kube-api-access-2n5rb\") pod \"lws-controller-manager-56b87855f9-7pkwp\" (UID: \"9bb6896d-7f01-46d8-9997-83170aa22077\") " pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:18.937573 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:18.937470 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:19.063973 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:19.063851 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp"] Apr 20 21:18:19.067279 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:18:19.067231 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bb6896d_7f01_46d8_9997_83170aa22077.slice/crio-69e01c714cbb33f74f0573ffa3641a9f8fa89e093ba298de4251d8d5ab8fb3f2 WatchSource:0}: Error finding container 69e01c714cbb33f74f0573ffa3641a9f8fa89e093ba298de4251d8d5ab8fb3f2: Status 404 returned error can't find the container with id 69e01c714cbb33f74f0573ffa3641a9f8fa89e093ba298de4251d8d5ab8fb3f2 Apr 20 21:18:19.446244 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:19.446202 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" event={"ID":"9bb6896d-7f01-46d8-9997-83170aa22077","Type":"ContainerStarted","Data":"69e01c714cbb33f74f0573ffa3641a9f8fa89e093ba298de4251d8d5ab8fb3f2"} Apr 20 21:18:19.447614 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:19.447581 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" event={"ID":"9b6b2586-750e-4c97-bd26-6712ca103c6d","Type":"ContainerStarted","Data":"56e5a76a9e0cf1a8cedd48d144f189f656ae4c2ea2ca1fca0bdec804868040c3"} Apr 20 21:18:22.459659 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:22.459626 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" event={"ID":"9bb6896d-7f01-46d8-9997-83170aa22077","Type":"ContainerStarted","Data":"6718e029b4adde465a62048b6b69d7a699e64b1d44c67c0f5c3c36e843308b97"} Apr 20 21:18:22.460137 ip-10-0-129-149 kubenswrapper[2571]: 
I0420 21:18:22.459741 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:22.460966 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:22.460945 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" event={"ID":"9b6b2586-750e-4c97-bd26-6712ca103c6d","Type":"ContainerStarted","Data":"c3a691fad08849d32b2d9a2c8c8049d8d00da40ed096f0a25b56a52fb1382415"} Apr 20 21:18:22.461081 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:22.461071 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" Apr 20 21:18:22.476975 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:22.476923 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" podStartSLOduration=2.039469998 podStartE2EDuration="4.476910916s" podCreationTimestamp="2026-04-20 21:18:18 +0000 UTC" firstStartedPulling="2026-04-20 21:18:19.069605484 +0000 UTC m=+309.312564582" lastFinishedPulling="2026-04-20 21:18:21.50704639 +0000 UTC m=+311.750005500" observedRunningTime="2026-04-20 21:18:22.475093214 +0000 UTC m=+312.718052334" watchObservedRunningTime="2026-04-20 21:18:22.476910916 +0000 UTC m=+312.719870109" Apr 20 21:18:22.500259 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:22.500215 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" podStartSLOduration=1.553791596 podStartE2EDuration="4.500203828s" podCreationTimestamp="2026-04-20 21:18:18 +0000 UTC" firstStartedPulling="2026-04-20 21:18:18.560805882 +0000 UTC m=+308.803764983" lastFinishedPulling="2026-04-20 21:18:21.507218111 +0000 UTC m=+311.750177215" observedRunningTime="2026-04-20 21:18:22.498581183 +0000 
UTC m=+312.741540302" watchObservedRunningTime="2026-04-20 21:18:22.500203828 +0000 UTC m=+312.743162945" Apr 20 21:18:33.466869 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:33.466837 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-56b87855f9-7pkwp" Apr 20 21:18:33.467275 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:18:33.466890 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-85fc55dd88-5pr8n" Apr 20 21:19:05.905000 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:05.904916 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw"] Apr 20 21:19:05.908390 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:05.908366 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:05.910641 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:05.910614 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-pxh4g\"" Apr 20 21:19:05.910778 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:05.910626 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 21:19:05.918726 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:05.918674 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw"] Apr 20 21:19:05.949564 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:05.949537 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzmp4\" (UniqueName: 
\"kubernetes.io/projected/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-kube-api-access-lzmp4\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:05.949725 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:05.949591 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:05.949725 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:05.949672 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:05.949725 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:05.949712 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:05.949885 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:05.949742 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:05.949885 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:05.949770 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:05.949885 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:05.949806 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:05.949885 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:05.949848 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:05.950040 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:05.949901 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.050715 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.050679 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.050883 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.050728 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzmp4\" (UniqueName: \"kubernetes.io/projected/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-kube-api-access-lzmp4\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.050883 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.050772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.050883 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.050815 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" 
(UniqueName: \"kubernetes.io/downward-api/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.050883 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.050846 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.050883 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.050871 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.051120 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.051017 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.051120 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.051062 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.051120 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.051109 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.051507 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.051268 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.051507 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.051420 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.051507 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.051433 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.051741 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.051597 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.051796 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.051777 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.053526 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.053504 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.053654 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.053602 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-istio-podinfo\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.061093 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.061068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzmp4\" (UniqueName: \"kubernetes.io/projected/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-kube-api-access-lzmp4\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.063746 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.063712 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cfc0536e-7bf6-4f10-ab67-6859ef69f0be-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw\" (UID: \"cfc0536e-7bf6-4f10-ab67-6859ef69f0be\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.221950 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.221875 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:06.349674 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.349645 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw"] Apr 20 21:19:06.353003 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:19:06.352964 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfc0536e_7bf6_4f10_ab67_6859ef69f0be.slice/crio-510e40180efeeb2869b0a98630107d88d49e278b46d5c1b6bd53cb82cd3b426f WatchSource:0}: Error finding container 510e40180efeeb2869b0a98630107d88d49e278b46d5c1b6bd53cb82cd3b426f: Status 404 returned error can't find the container with id 510e40180efeeb2869b0a98630107d88d49e278b46d5c1b6bd53cb82cd3b426f Apr 20 21:19:06.610205 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:06.610154 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" event={"ID":"cfc0536e-7bf6-4f10-ab67-6859ef69f0be","Type":"ContainerStarted","Data":"510e40180efeeb2869b0a98630107d88d49e278b46d5c1b6bd53cb82cd3b426f"} Apr 20 21:19:09.105514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:09.105474 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 21:19:09.105793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:09.105545 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 21:19:09.105793 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:09.105572 2571 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 21:19:09.620835 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:09.620799 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" event={"ID":"cfc0536e-7bf6-4f10-ab67-6859ef69f0be","Type":"ContainerStarted","Data":"7bf9ed77fb46313ead94e92e09b588720bc890b400c312807a984e4be2044c6e"} Apr 20 21:19:09.640419 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:09.640361 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" podStartSLOduration=1.890067289 podStartE2EDuration="4.640342198s" podCreationTimestamp="2026-04-20 21:19:05 +0000 UTC" firstStartedPulling="2026-04-20 21:19:06.354951702 +0000 UTC m=+356.597910801" lastFinishedPulling="2026-04-20 21:19:09.105226609 +0000 UTC m=+359.348185710" observedRunningTime="2026-04-20 21:19:09.639715183 +0000 UTC m=+359.882674304" watchObservedRunningTime="2026-04-20 21:19:09.640342198 +0000 UTC m=+359.883301318" Apr 20 21:19:10.222392 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:10.222366 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:10.226933 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:10.226909 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:10.624534 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:10.624502 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:10.625509 ip-10-0-129-149 kubenswrapper[2571]: 
I0420 21:19:10.625488 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw" Apr 20 21:19:37.297510 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:37.297475 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-w52np"] Apr 20 21:19:37.300676 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:37.300660 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-w52np" Apr 20 21:19:37.302927 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:37.302901 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-rddfd\"" Apr 20 21:19:37.303043 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:37.302926 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 21:19:37.303043 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:37.302965 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 21:19:37.308278 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:37.308255 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-w52np"] Apr 20 21:19:37.428284 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:37.428234 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7q2h\" (UniqueName: \"kubernetes.io/projected/82685481-131d-4d3f-82b3-330bdfd87b40-kube-api-access-h7q2h\") pod \"kuadrant-operator-catalog-w52np\" (UID: \"82685481-131d-4d3f-82b3-330bdfd87b40\") " pod="kuadrant-system/kuadrant-operator-catalog-w52np" Apr 20 21:19:37.529615 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:37.529579 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h7q2h\" (UniqueName: \"kubernetes.io/projected/82685481-131d-4d3f-82b3-330bdfd87b40-kube-api-access-h7q2h\") pod \"kuadrant-operator-catalog-w52np\" (UID: \"82685481-131d-4d3f-82b3-330bdfd87b40\") " pod="kuadrant-system/kuadrant-operator-catalog-w52np" Apr 20 21:19:37.538182 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:37.538154 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7q2h\" (UniqueName: \"kubernetes.io/projected/82685481-131d-4d3f-82b3-330bdfd87b40-kube-api-access-h7q2h\") pod \"kuadrant-operator-catalog-w52np\" (UID: \"82685481-131d-4d3f-82b3-330bdfd87b40\") " pod="kuadrant-system/kuadrant-operator-catalog-w52np" Apr 20 21:19:37.611067 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:37.610983 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-w52np" Apr 20 21:19:37.863254 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:37.863143 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-w52np"] Apr 20 21:19:37.952519 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:37.952496 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-w52np"] Apr 20 21:19:37.954592 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:19:37.954564 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82685481_131d_4d3f_82b3_330bdfd87b40.slice/crio-7fd05fdc6862b8b90d66c6b952da65575c9d721b6a392a5925b729d858981504 WatchSource:0}: Error finding container 7fd05fdc6862b8b90d66c6b952da65575c9d721b6a392a5925b729d858981504: Status 404 returned error can't find the container with id 7fd05fdc6862b8b90d66c6b952da65575c9d721b6a392a5925b729d858981504 Apr 20 21:19:38.717374 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:38.717339 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-w52np" event={"ID":"82685481-131d-4d3f-82b3-330bdfd87b40","Type":"ContainerStarted","Data":"7fd05fdc6862b8b90d66c6b952da65575c9d721b6a392a5925b729d858981504"} Apr 20 21:19:40.728149 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:40.728110 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-w52np" event={"ID":"82685481-131d-4d3f-82b3-330bdfd87b40","Type":"ContainerStarted","Data":"999bd0c9331edc94dad0eeb687b0ea5614471e4c741f9b031b5a5b91de018967"} Apr 20 21:19:40.728580 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:40.728231 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-w52np" podUID="82685481-131d-4d3f-82b3-330bdfd87b40" containerName="registry-server" containerID="cri-o://999bd0c9331edc94dad0eeb687b0ea5614471e4c741f9b031b5a5b91de018967" gracePeriod=2 Apr 20 21:19:40.743746 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:40.743698 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-w52np" podStartSLOduration=1.473315803 podStartE2EDuration="3.743679543s" podCreationTimestamp="2026-04-20 21:19:37 +0000 UTC" firstStartedPulling="2026-04-20 21:19:37.955855779 +0000 UTC m=+388.198814879" lastFinishedPulling="2026-04-20 21:19:40.226219504 +0000 UTC m=+390.469178619" observedRunningTime="2026-04-20 21:19:40.742636186 +0000 UTC m=+390.985595306" watchObservedRunningTime="2026-04-20 21:19:40.743679543 +0000 UTC m=+390.986638664" Apr 20 21:19:40.969893 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:40.969870 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-w52np" Apr 20 21:19:41.062570 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:41.062531 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7q2h\" (UniqueName: \"kubernetes.io/projected/82685481-131d-4d3f-82b3-330bdfd87b40-kube-api-access-h7q2h\") pod \"82685481-131d-4d3f-82b3-330bdfd87b40\" (UID: \"82685481-131d-4d3f-82b3-330bdfd87b40\") " Apr 20 21:19:41.064794 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:41.064770 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82685481-131d-4d3f-82b3-330bdfd87b40-kube-api-access-h7q2h" (OuterVolumeSpecName: "kube-api-access-h7q2h") pod "82685481-131d-4d3f-82b3-330bdfd87b40" (UID: "82685481-131d-4d3f-82b3-330bdfd87b40"). InnerVolumeSpecName "kube-api-access-h7q2h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:19:41.163619 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:41.163582 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h7q2h\" (UniqueName: \"kubernetes.io/projected/82685481-131d-4d3f-82b3-330bdfd87b40-kube-api-access-h7q2h\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:19:41.732434 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:41.732398 2571 generic.go:358] "Generic (PLEG): container finished" podID="82685481-131d-4d3f-82b3-330bdfd87b40" containerID="999bd0c9331edc94dad0eeb687b0ea5614471e4c741f9b031b5a5b91de018967" exitCode=0 Apr 20 21:19:41.732880 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:41.732463 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-w52np" Apr 20 21:19:41.732880 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:41.732471 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-w52np" event={"ID":"82685481-131d-4d3f-82b3-330bdfd87b40","Type":"ContainerDied","Data":"999bd0c9331edc94dad0eeb687b0ea5614471e4c741f9b031b5a5b91de018967"} Apr 20 21:19:41.732880 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:41.732522 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-w52np" event={"ID":"82685481-131d-4d3f-82b3-330bdfd87b40","Type":"ContainerDied","Data":"7fd05fdc6862b8b90d66c6b952da65575c9d721b6a392a5925b729d858981504"} Apr 20 21:19:41.732880 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:41.732545 2571 scope.go:117] "RemoveContainer" containerID="999bd0c9331edc94dad0eeb687b0ea5614471e4c741f9b031b5a5b91de018967" Apr 20 21:19:41.741430 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:41.741414 2571 scope.go:117] "RemoveContainer" containerID="999bd0c9331edc94dad0eeb687b0ea5614471e4c741f9b031b5a5b91de018967" Apr 20 21:19:41.741698 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:19:41.741676 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"999bd0c9331edc94dad0eeb687b0ea5614471e4c741f9b031b5a5b91de018967\": container with ID starting with 999bd0c9331edc94dad0eeb687b0ea5614471e4c741f9b031b5a5b91de018967 not found: ID does not exist" containerID="999bd0c9331edc94dad0eeb687b0ea5614471e4c741f9b031b5a5b91de018967" Apr 20 21:19:41.741754 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:41.741707 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"999bd0c9331edc94dad0eeb687b0ea5614471e4c741f9b031b5a5b91de018967"} err="failed to get container status \"999bd0c9331edc94dad0eeb687b0ea5614471e4c741f9b031b5a5b91de018967\": rpc 
error: code = NotFound desc = could not find container \"999bd0c9331edc94dad0eeb687b0ea5614471e4c741f9b031b5a5b91de018967\": container with ID starting with 999bd0c9331edc94dad0eeb687b0ea5614471e4c741f9b031b5a5b91de018967 not found: ID does not exist" Apr 20 21:19:41.751543 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:41.751521 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-w52np"] Apr 20 21:19:41.755057 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:41.755037 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-w52np"] Apr 20 21:19:42.341000 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:42.340966 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82685481-131d-4d3f-82b3-330bdfd87b40" path="/var/lib/kubelet/pods/82685481-131d-4d3f-82b3-330bdfd87b40/volumes" Apr 20 21:19:54.114048 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.114007 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89"] Apr 20 21:19:54.114494 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.114371 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82685481-131d-4d3f-82b3-330bdfd87b40" containerName="registry-server" Apr 20 21:19:54.114494 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.114384 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="82685481-131d-4d3f-82b3-330bdfd87b40" containerName="registry-server" Apr 20 21:19:54.114494 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.114449 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="82685481-131d-4d3f-82b3-330bdfd87b40" containerName="registry-server" Apr 20 21:19:54.119951 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.119930 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" Apr 20 21:19:54.122467 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.122442 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 21:19:54.122576 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.122497 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 21:19:54.123018 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.123005 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-tx77r\"" Apr 20 21:19:54.126612 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.126587 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89"] Apr 20 21:19:54.177199 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.177141 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl4bw\" (UniqueName: \"kubernetes.io/projected/d712d209-faa7-4304-af59-07b4282072e9-kube-api-access-dl4bw\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89\" (UID: \"d712d209-faa7-4304-af59-07b4282072e9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" Apr 20 21:19:54.177385 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.177227 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d712d209-faa7-4304-af59-07b4282072e9-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89\" (UID: \"d712d209-faa7-4304-af59-07b4282072e9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" Apr 20 21:19:54.177385 
ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.177298 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d712d209-faa7-4304-af59-07b4282072e9-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89\" (UID: \"d712d209-faa7-4304-af59-07b4282072e9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" Apr 20 21:19:54.278609 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.278576 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d712d209-faa7-4304-af59-07b4282072e9-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89\" (UID: \"d712d209-faa7-4304-af59-07b4282072e9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" Apr 20 21:19:54.278819 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.278629 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dl4bw\" (UniqueName: \"kubernetes.io/projected/d712d209-faa7-4304-af59-07b4282072e9-kube-api-access-dl4bw\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89\" (UID: \"d712d209-faa7-4304-af59-07b4282072e9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" Apr 20 21:19:54.278819 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.278760 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d712d209-faa7-4304-af59-07b4282072e9-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89\" (UID: \"d712d209-faa7-4304-af59-07b4282072e9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" Apr 20 21:19:54.279049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.279025 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d712d209-faa7-4304-af59-07b4282072e9-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89\" (UID: \"d712d209-faa7-4304-af59-07b4282072e9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" Apr 20 21:19:54.279113 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.279057 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d712d209-faa7-4304-af59-07b4282072e9-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89\" (UID: \"d712d209-faa7-4304-af59-07b4282072e9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" Apr 20 21:19:54.288744 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.288709 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl4bw\" (UniqueName: \"kubernetes.io/projected/d712d209-faa7-4304-af59-07b4282072e9-kube-api-access-dl4bw\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89\" (UID: \"d712d209-faa7-4304-af59-07b4282072e9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" Apr 20 21:19:54.314368 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.314335 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h"] Apr 20 21:19:54.316786 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.316769 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" Apr 20 21:19:54.325527 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.325504 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h"] Apr 20 21:19:54.379756 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.379675 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njts2\" (UniqueName: \"kubernetes.io/projected/71fe2008-8bc9-49f7-90c6-8caefcc51c20-kube-api-access-njts2\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h\" (UID: \"71fe2008-8bc9-49f7-90c6-8caefcc51c20\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" Apr 20 21:19:54.379756 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.379716 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/71fe2008-8bc9-49f7-90c6-8caefcc51c20-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h\" (UID: \"71fe2008-8bc9-49f7-90c6-8caefcc51c20\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" Apr 20 21:19:54.379756 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.379737 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/71fe2008-8bc9-49f7-90c6-8caefcc51c20-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h\" (UID: \"71fe2008-8bc9-49f7-90c6-8caefcc51c20\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" Apr 20 21:19:54.430366 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.430327 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" Apr 20 21:19:54.480813 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.480780 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njts2\" (UniqueName: \"kubernetes.io/projected/71fe2008-8bc9-49f7-90c6-8caefcc51c20-kube-api-access-njts2\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h\" (UID: \"71fe2008-8bc9-49f7-90c6-8caefcc51c20\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" Apr 20 21:19:54.480949 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.480828 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/71fe2008-8bc9-49f7-90c6-8caefcc51c20-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h\" (UID: \"71fe2008-8bc9-49f7-90c6-8caefcc51c20\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" Apr 20 21:19:54.480949 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.480846 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/71fe2008-8bc9-49f7-90c6-8caefcc51c20-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h\" (UID: \"71fe2008-8bc9-49f7-90c6-8caefcc51c20\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" Apr 20 21:19:54.481479 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.481142 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/71fe2008-8bc9-49f7-90c6-8caefcc51c20-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h\" (UID: \"71fe2008-8bc9-49f7-90c6-8caefcc51c20\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" 
Apr 20 21:19:54.481479 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.481469 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/71fe2008-8bc9-49f7-90c6-8caefcc51c20-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h\" (UID: \"71fe2008-8bc9-49f7-90c6-8caefcc51c20\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" Apr 20 21:19:54.489999 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.489951 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njts2\" (UniqueName: \"kubernetes.io/projected/71fe2008-8bc9-49f7-90c6-8caefcc51c20-kube-api-access-njts2\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h\" (UID: \"71fe2008-8bc9-49f7-90c6-8caefcc51c20\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" Apr 20 21:19:54.557679 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.557659 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89"] Apr 20 21:19:54.560307 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:19:54.560278 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd712d209_faa7_4304_af59_07b4282072e9.slice/crio-e69529333f2c9bccc47d4f78344dd93a41c3d68fa0f2a0e3412676034f01f28f WatchSource:0}: Error finding container e69529333f2c9bccc47d4f78344dd93a41c3d68fa0f2a0e3412676034f01f28f: Status 404 returned error can't find the container with id e69529333f2c9bccc47d4f78344dd93a41c3d68fa0f2a0e3412676034f01f28f Apr 20 21:19:54.626586 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.626565 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" Apr 20 21:19:54.749363 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.749332 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h"] Apr 20 21:19:54.751770 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:19:54.751741 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71fe2008_8bc9_49f7_90c6_8caefcc51c20.slice/crio-6c31175bf0700a41d36b77f83eed289af74a453d95a26588bd35e502459e0b6f WatchSource:0}: Error finding container 6c31175bf0700a41d36b77f83eed289af74a453d95a26588bd35e502459e0b6f: Status 404 returned error can't find the container with id 6c31175bf0700a41d36b77f83eed289af74a453d95a26588bd35e502459e0b6f Apr 20 21:19:54.773625 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.773602 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" event={"ID":"71fe2008-8bc9-49f7-90c6-8caefcc51c20","Type":"ContainerStarted","Data":"6c31175bf0700a41d36b77f83eed289af74a453d95a26588bd35e502459e0b6f"} Apr 20 21:19:54.774659 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.774627 2571 generic.go:358] "Generic (PLEG): container finished" podID="d712d209-faa7-4304-af59-07b4282072e9" containerID="c44a9db59dbf55efe16cb534497f6476e087bc9d1da2b75575070b15da3a5b3b" exitCode=0 Apr 20 21:19:54.774754 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.774708 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" event={"ID":"d712d209-faa7-4304-af59-07b4282072e9","Type":"ContainerDied","Data":"c44a9db59dbf55efe16cb534497f6476e087bc9d1da2b75575070b15da3a5b3b"} Apr 20 21:19:54.774754 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:54.774733 
2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" event={"ID":"d712d209-faa7-4304-af59-07b4282072e9","Type":"ContainerStarted","Data":"e69529333f2c9bccc47d4f78344dd93a41c3d68fa0f2a0e3412676034f01f28f"} Apr 20 21:19:55.512941 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.512903 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4"] Apr 20 21:19:55.515244 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.515228 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" Apr 20 21:19:55.524636 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.524608 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4"] Apr 20 21:19:55.591192 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.591145 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpmlp\" (UniqueName: \"kubernetes.io/projected/ee1835fe-3c55-4dad-a004-b108263d101e-kube-api-access-dpmlp\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4\" (UID: \"ee1835fe-3c55-4dad-a004-b108263d101e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" Apr 20 21:19:55.591314 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.591292 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee1835fe-3c55-4dad-a004-b108263d101e-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4\" (UID: \"ee1835fe-3c55-4dad-a004-b108263d101e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" Apr 20 
21:19:55.591372 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.591337 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee1835fe-3c55-4dad-a004-b108263d101e-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4\" (UID: \"ee1835fe-3c55-4dad-a004-b108263d101e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" Apr 20 21:19:55.691959 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.691926 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee1835fe-3c55-4dad-a004-b108263d101e-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4\" (UID: \"ee1835fe-3c55-4dad-a004-b108263d101e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" Apr 20 21:19:55.692138 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.691973 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpmlp\" (UniqueName: \"kubernetes.io/projected/ee1835fe-3c55-4dad-a004-b108263d101e-kube-api-access-dpmlp\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4\" (UID: \"ee1835fe-3c55-4dad-a004-b108263d101e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" Apr 20 21:19:55.692138 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.692036 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee1835fe-3c55-4dad-a004-b108263d101e-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4\" (UID: \"ee1835fe-3c55-4dad-a004-b108263d101e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" Apr 20 21:19:55.692384 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.692361 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee1835fe-3c55-4dad-a004-b108263d101e-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4\" (UID: \"ee1835fe-3c55-4dad-a004-b108263d101e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" Apr 20 21:19:55.692447 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.692369 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee1835fe-3c55-4dad-a004-b108263d101e-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4\" (UID: \"ee1835fe-3c55-4dad-a004-b108263d101e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" Apr 20 21:19:55.699746 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.699725 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpmlp\" (UniqueName: \"kubernetes.io/projected/ee1835fe-3c55-4dad-a004-b108263d101e-kube-api-access-dpmlp\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4\" (UID: \"ee1835fe-3c55-4dad-a004-b108263d101e\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" Apr 20 21:19:55.780159 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.780064 2571 generic.go:358] "Generic (PLEG): container finished" podID="71fe2008-8bc9-49f7-90c6-8caefcc51c20" containerID="11968453f2b1b71c2603f58d05dff98215485476095f844b734b090bbdbebf51" exitCode=0 Apr 20 21:19:55.780353 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.780155 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" event={"ID":"71fe2008-8bc9-49f7-90c6-8caefcc51c20","Type":"ContainerDied","Data":"11968453f2b1b71c2603f58d05dff98215485476095f844b734b090bbdbebf51"} Apr 20 21:19:55.782038 
ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.782016 2571 generic.go:358] "Generic (PLEG): container finished" podID="d712d209-faa7-4304-af59-07b4282072e9" containerID="c7e0568584afe1a62153147f35092f222b5d5d6d24b7bfe1086a02171b5b96ec" exitCode=0 Apr 20 21:19:55.782131 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.782053 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" event={"ID":"d712d209-faa7-4304-af59-07b4282072e9","Type":"ContainerDied","Data":"c7e0568584afe1a62153147f35092f222b5d5d6d24b7bfe1086a02171b5b96ec"} Apr 20 21:19:55.826513 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.826491 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" Apr 20 21:19:55.961590 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:55.961566 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4"] Apr 20 21:19:55.980043 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:19:55.979994 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee1835fe_3c55_4dad_a004_b108263d101e.slice/crio-c124350ef73b34e294966fc1fd9d4b15cc37af832501dc0d4f931079c36bf6fc WatchSource:0}: Error finding container c124350ef73b34e294966fc1fd9d4b15cc37af832501dc0d4f931079c36bf6fc: Status 404 returned error can't find the container with id c124350ef73b34e294966fc1fd9d4b15cc37af832501dc0d4f931079c36bf6fc Apr 20 21:19:56.788233 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:56.788155 2571 generic.go:358] "Generic (PLEG): container finished" podID="d712d209-faa7-4304-af59-07b4282072e9" containerID="56fb217d0d075760348aafeb5b91eb22be813094f83ab89fa389e6a20421aceb" exitCode=0 Apr 20 21:19:56.788233 ip-10-0-129-149 kubenswrapper[2571]: I0420 
21:19:56.788220 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" event={"ID":"d712d209-faa7-4304-af59-07b4282072e9","Type":"ContainerDied","Data":"56fb217d0d075760348aafeb5b91eb22be813094f83ab89fa389e6a20421aceb"} Apr 20 21:19:56.789943 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:56.789920 2571 generic.go:358] "Generic (PLEG): container finished" podID="71fe2008-8bc9-49f7-90c6-8caefcc51c20" containerID="95ea8200149c749696055281fda51888ef3f5c2798865cdf57b2696302ba77bd" exitCode=0 Apr 20 21:19:56.790065 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:56.789954 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" event={"ID":"71fe2008-8bc9-49f7-90c6-8caefcc51c20","Type":"ContainerDied","Data":"95ea8200149c749696055281fda51888ef3f5c2798865cdf57b2696302ba77bd"} Apr 20 21:19:56.791283 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:56.791263 2571 generic.go:358] "Generic (PLEG): container finished" podID="ee1835fe-3c55-4dad-a004-b108263d101e" containerID="ff0d54f8c89636db0781e5cfe94f4b0b8509633cf3e8b4d23dd48643d7900121" exitCode=0 Apr 20 21:19:56.791367 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:56.791316 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" event={"ID":"ee1835fe-3c55-4dad-a004-b108263d101e","Type":"ContainerDied","Data":"ff0d54f8c89636db0781e5cfe94f4b0b8509633cf3e8b4d23dd48643d7900121"} Apr 20 21:19:56.791367 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:56.791334 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" event={"ID":"ee1835fe-3c55-4dad-a004-b108263d101e","Type":"ContainerStarted","Data":"c124350ef73b34e294966fc1fd9d4b15cc37af832501dc0d4f931079c36bf6fc"} Apr 20 21:19:57.797115 
ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:57.797084 2571 generic.go:358] "Generic (PLEG): container finished" podID="71fe2008-8bc9-49f7-90c6-8caefcc51c20" containerID="f9d1690c7a3eb709115925cf982dfadde6a5f423d06b298edf8b0d3625c8e2f7" exitCode=0 Apr 20 21:19:57.797580 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:57.797146 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" event={"ID":"71fe2008-8bc9-49f7-90c6-8caefcc51c20","Type":"ContainerDied","Data":"f9d1690c7a3eb709115925cf982dfadde6a5f423d06b298edf8b0d3625c8e2f7"} Apr 20 21:19:57.798782 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:57.798759 2571 generic.go:358] "Generic (PLEG): container finished" podID="ee1835fe-3c55-4dad-a004-b108263d101e" containerID="dac54c3cb8a73413f8aa2902c00c61165f2ffd64f53a5440447b5666475141de" exitCode=0 Apr 20 21:19:57.798902 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:57.798840 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" event={"ID":"ee1835fe-3c55-4dad-a004-b108263d101e","Type":"ContainerDied","Data":"dac54c3cb8a73413f8aa2902c00c61165f2ffd64f53a5440447b5666475141de"} Apr 20 21:19:57.933413 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:57.933383 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" Apr 20 21:19:58.012075 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:58.012042 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d712d209-faa7-4304-af59-07b4282072e9-bundle\") pod \"d712d209-faa7-4304-af59-07b4282072e9\" (UID: \"d712d209-faa7-4304-af59-07b4282072e9\") " Apr 20 21:19:58.012265 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:58.012121 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl4bw\" (UniqueName: \"kubernetes.io/projected/d712d209-faa7-4304-af59-07b4282072e9-kube-api-access-dl4bw\") pod \"d712d209-faa7-4304-af59-07b4282072e9\" (UID: \"d712d209-faa7-4304-af59-07b4282072e9\") " Apr 20 21:19:58.012265 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:58.012201 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d712d209-faa7-4304-af59-07b4282072e9-util\") pod \"d712d209-faa7-4304-af59-07b4282072e9\" (UID: \"d712d209-faa7-4304-af59-07b4282072e9\") " Apr 20 21:19:58.012580 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:58.012551 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d712d209-faa7-4304-af59-07b4282072e9-bundle" (OuterVolumeSpecName: "bundle") pod "d712d209-faa7-4304-af59-07b4282072e9" (UID: "d712d209-faa7-4304-af59-07b4282072e9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:19:58.014285 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:58.014261 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d712d209-faa7-4304-af59-07b4282072e9-kube-api-access-dl4bw" (OuterVolumeSpecName: "kube-api-access-dl4bw") pod "d712d209-faa7-4304-af59-07b4282072e9" (UID: "d712d209-faa7-4304-af59-07b4282072e9"). InnerVolumeSpecName "kube-api-access-dl4bw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:19:58.017533 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:58.017510 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d712d209-faa7-4304-af59-07b4282072e9-util" (OuterVolumeSpecName: "util") pod "d712d209-faa7-4304-af59-07b4282072e9" (UID: "d712d209-faa7-4304-af59-07b4282072e9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:19:58.113406 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:58.113330 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dl4bw\" (UniqueName: \"kubernetes.io/projected/d712d209-faa7-4304-af59-07b4282072e9-kube-api-access-dl4bw\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:19:58.113406 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:58.113369 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d712d209-faa7-4304-af59-07b4282072e9-util\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:19:58.113406 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:58.113379 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d712d209-faa7-4304-af59-07b4282072e9-bundle\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:19:58.804227 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:58.804192 2571 generic.go:358] "Generic 
(PLEG): container finished" podID="ee1835fe-3c55-4dad-a004-b108263d101e" containerID="65bae7a77c3a4a90261f5d20752979bafdae3fb368067cbe725a07593538242b" exitCode=0 Apr 20 21:19:58.804593 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:58.804282 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" event={"ID":"ee1835fe-3c55-4dad-a004-b108263d101e","Type":"ContainerDied","Data":"65bae7a77c3a4a90261f5d20752979bafdae3fb368067cbe725a07593538242b"} Apr 20 21:19:58.805948 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:58.805925 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" Apr 20 21:19:58.806067 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:58.805969 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89" event={"ID":"d712d209-faa7-4304-af59-07b4282072e9","Type":"ContainerDied","Data":"e69529333f2c9bccc47d4f78344dd93a41c3d68fa0f2a0e3412676034f01f28f"} Apr 20 21:19:58.806067 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:58.805990 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e69529333f2c9bccc47d4f78344dd93a41c3d68fa0f2a0e3412676034f01f28f" Apr 20 21:19:58.931268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:58.931248 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" Apr 20 21:19:59.021565 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:59.021534 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njts2\" (UniqueName: \"kubernetes.io/projected/71fe2008-8bc9-49f7-90c6-8caefcc51c20-kube-api-access-njts2\") pod \"71fe2008-8bc9-49f7-90c6-8caefcc51c20\" (UID: \"71fe2008-8bc9-49f7-90c6-8caefcc51c20\") " Apr 20 21:19:59.021711 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:59.021652 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/71fe2008-8bc9-49f7-90c6-8caefcc51c20-bundle\") pod \"71fe2008-8bc9-49f7-90c6-8caefcc51c20\" (UID: \"71fe2008-8bc9-49f7-90c6-8caefcc51c20\") " Apr 20 21:19:59.021711 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:59.021676 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/71fe2008-8bc9-49f7-90c6-8caefcc51c20-util\") pod \"71fe2008-8bc9-49f7-90c6-8caefcc51c20\" (UID: \"71fe2008-8bc9-49f7-90c6-8caefcc51c20\") " Apr 20 21:19:59.022278 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:59.022253 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71fe2008-8bc9-49f7-90c6-8caefcc51c20-bundle" (OuterVolumeSpecName: "bundle") pod "71fe2008-8bc9-49f7-90c6-8caefcc51c20" (UID: "71fe2008-8bc9-49f7-90c6-8caefcc51c20"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:19:59.023809 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:59.023788 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71fe2008-8bc9-49f7-90c6-8caefcc51c20-kube-api-access-njts2" (OuterVolumeSpecName: "kube-api-access-njts2") pod "71fe2008-8bc9-49f7-90c6-8caefcc51c20" (UID: "71fe2008-8bc9-49f7-90c6-8caefcc51c20"). InnerVolumeSpecName "kube-api-access-njts2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:19:59.027107 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:59.027069 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71fe2008-8bc9-49f7-90c6-8caefcc51c20-util" (OuterVolumeSpecName: "util") pod "71fe2008-8bc9-49f7-90c6-8caefcc51c20" (UID: "71fe2008-8bc9-49f7-90c6-8caefcc51c20"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:19:59.122629 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:59.122550 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/71fe2008-8bc9-49f7-90c6-8caefcc51c20-bundle\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:19:59.122629 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:59.122583 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/71fe2008-8bc9-49f7-90c6-8caefcc51c20-util\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:19:59.122629 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:59.122592 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-njts2\" (UniqueName: \"kubernetes.io/projected/71fe2008-8bc9-49f7-90c6-8caefcc51c20-kube-api-access-njts2\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:19:59.811082 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:59.811051 2571 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" Apr 20 21:19:59.811471 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:59.811050 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h" event={"ID":"71fe2008-8bc9-49f7-90c6-8caefcc51c20","Type":"ContainerDied","Data":"6c31175bf0700a41d36b77f83eed289af74a453d95a26588bd35e502459e0b6f"} Apr 20 21:19:59.811471 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:59.811207 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c31175bf0700a41d36b77f83eed289af74a453d95a26588bd35e502459e0b6f" Apr 20 21:19:59.936771 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:19:59.936749 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" Apr 20 21:20:00.028585 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:00.028545 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpmlp\" (UniqueName: \"kubernetes.io/projected/ee1835fe-3c55-4dad-a004-b108263d101e-kube-api-access-dpmlp\") pod \"ee1835fe-3c55-4dad-a004-b108263d101e\" (UID: \"ee1835fe-3c55-4dad-a004-b108263d101e\") " Apr 20 21:20:00.028773 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:00.028620 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee1835fe-3c55-4dad-a004-b108263d101e-bundle\") pod \"ee1835fe-3c55-4dad-a004-b108263d101e\" (UID: \"ee1835fe-3c55-4dad-a004-b108263d101e\") " Apr 20 21:20:00.028773 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:00.028664 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee1835fe-3c55-4dad-a004-b108263d101e-util\") pod 
\"ee1835fe-3c55-4dad-a004-b108263d101e\" (UID: \"ee1835fe-3c55-4dad-a004-b108263d101e\") " Apr 20 21:20:00.029163 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:00.029122 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee1835fe-3c55-4dad-a004-b108263d101e-bundle" (OuterVolumeSpecName: "bundle") pod "ee1835fe-3c55-4dad-a004-b108263d101e" (UID: "ee1835fe-3c55-4dad-a004-b108263d101e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:20:00.030888 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:00.030863 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1835fe-3c55-4dad-a004-b108263d101e-kube-api-access-dpmlp" (OuterVolumeSpecName: "kube-api-access-dpmlp") pod "ee1835fe-3c55-4dad-a004-b108263d101e" (UID: "ee1835fe-3c55-4dad-a004-b108263d101e"). InnerVolumeSpecName "kube-api-access-dpmlp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:20:00.033965 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:00.033938 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee1835fe-3c55-4dad-a004-b108263d101e-util" (OuterVolumeSpecName: "util") pod "ee1835fe-3c55-4dad-a004-b108263d101e" (UID: "ee1835fe-3c55-4dad-a004-b108263d101e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:20:00.129475 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:00.129413 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee1835fe-3c55-4dad-a004-b108263d101e-bundle\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:20:00.129475 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:00.129436 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee1835fe-3c55-4dad-a004-b108263d101e-util\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:20:00.129475 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:00.129446 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dpmlp\" (UniqueName: \"kubernetes.io/projected/ee1835fe-3c55-4dad-a004-b108263d101e-kube-api-access-dpmlp\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:20:00.816260 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:00.816208 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" event={"ID":"ee1835fe-3c55-4dad-a004-b108263d101e","Type":"ContainerDied","Data":"c124350ef73b34e294966fc1fd9d4b15cc37af832501dc0d4f931079c36bf6fc"} Apr 20 21:20:00.816260 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:00.816268 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c124350ef73b34e294966fc1fd9d4b15cc37af832501dc0d4f931079c36bf6fc" Apr 20 21:20:00.816682 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:00.816244 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4" Apr 20 21:20:15.596616 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596581 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-5mjpq"] Apr 20 21:20:15.596972 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596888 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee1835fe-3c55-4dad-a004-b108263d101e" containerName="util" Apr 20 21:20:15.596972 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596898 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1835fe-3c55-4dad-a004-b108263d101e" containerName="util" Apr 20 21:20:15.596972 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596909 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d712d209-faa7-4304-af59-07b4282072e9" containerName="extract" Apr 20 21:20:15.596972 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596915 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d712d209-faa7-4304-af59-07b4282072e9" containerName="extract" Apr 20 21:20:15.596972 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596924 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d712d209-faa7-4304-af59-07b4282072e9" containerName="util" Apr 20 21:20:15.596972 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596931 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d712d209-faa7-4304-af59-07b4282072e9" containerName="util" Apr 20 21:20:15.596972 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596938 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d712d209-faa7-4304-af59-07b4282072e9" containerName="pull" Apr 20 21:20:15.596972 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596943 2571 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d712d209-faa7-4304-af59-07b4282072e9" containerName="pull" Apr 20 21:20:15.596972 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596954 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71fe2008-8bc9-49f7-90c6-8caefcc51c20" containerName="util" Apr 20 21:20:15.596972 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596958 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="71fe2008-8bc9-49f7-90c6-8caefcc51c20" containerName="util" Apr 20 21:20:15.596972 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596967 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee1835fe-3c55-4dad-a004-b108263d101e" containerName="pull" Apr 20 21:20:15.596972 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596972 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1835fe-3c55-4dad-a004-b108263d101e" containerName="pull" Apr 20 21:20:15.596972 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596979 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee1835fe-3c55-4dad-a004-b108263d101e" containerName="extract" Apr 20 21:20:15.597459 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596984 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1835fe-3c55-4dad-a004-b108263d101e" containerName="extract" Apr 20 21:20:15.597459 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596992 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71fe2008-8bc9-49f7-90c6-8caefcc51c20" containerName="pull" Apr 20 21:20:15.597459 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.596998 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="71fe2008-8bc9-49f7-90c6-8caefcc51c20" containerName="pull" Apr 20 21:20:15.597459 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.597003 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71fe2008-8bc9-49f7-90c6-8caefcc51c20" 
containerName="extract" Apr 20 21:20:15.597459 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.597007 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="71fe2008-8bc9-49f7-90c6-8caefcc51c20" containerName="extract" Apr 20 21:20:15.597459 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.597053 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="71fe2008-8bc9-49f7-90c6-8caefcc51c20" containerName="extract" Apr 20 21:20:15.597459 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.597065 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee1835fe-3c55-4dad-a004-b108263d101e" containerName="extract" Apr 20 21:20:15.597459 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.597072 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d712d209-faa7-4304-af59-07b4282072e9" containerName="extract" Apr 20 21:20:15.604824 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.604804 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-5mjpq" Apr 20 21:20:15.607365 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.607339 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 21:20:15.607649 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.607629 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 21:20:15.607850 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.607832 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-9nb72\"" Apr 20 21:20:15.607992 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.607972 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 20 21:20:15.609222 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.609170 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-5mjpq"] Apr 20 21:20:15.656221 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.656167 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2whlt\" (UniqueName: \"kubernetes.io/projected/a2b5b080-0f37-4b48-a783-aeaff522e227-kube-api-access-2whlt\") pod \"dns-operator-controller-manager-648d5c98bc-5mjpq\" (UID: \"a2b5b080-0f37-4b48-a783-aeaff522e227\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-5mjpq" Apr 20 21:20:15.757643 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.757618 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2whlt\" (UniqueName: \"kubernetes.io/projected/a2b5b080-0f37-4b48-a783-aeaff522e227-kube-api-access-2whlt\") pod 
\"dns-operator-controller-manager-648d5c98bc-5mjpq\" (UID: \"a2b5b080-0f37-4b48-a783-aeaff522e227\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-5mjpq" Apr 20 21:20:15.765978 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.765945 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2whlt\" (UniqueName: \"kubernetes.io/projected/a2b5b080-0f37-4b48-a783-aeaff522e227-kube-api-access-2whlt\") pod \"dns-operator-controller-manager-648d5c98bc-5mjpq\" (UID: \"a2b5b080-0f37-4b48-a783-aeaff522e227\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-5mjpq" Apr 20 21:20:15.916379 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:15.916310 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-5mjpq" Apr 20 21:20:16.037652 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:16.037628 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-5mjpq"] Apr 20 21:20:16.039863 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:20:16.039834 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2b5b080_0f37_4b48_a783_aeaff522e227.slice/crio-44b9f445055914fe07892343cf4c9c7aecfaef1936e0840f730e45d81e0bce94 WatchSource:0}: Error finding container 44b9f445055914fe07892343cf4c9c7aecfaef1936e0840f730e45d81e0bce94: Status 404 returned error can't find the container with id 44b9f445055914fe07892343cf4c9c7aecfaef1936e0840f730e45d81e0bce94 Apr 20 21:20:16.871133 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:16.871097 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-5mjpq" event={"ID":"a2b5b080-0f37-4b48-a783-aeaff522e227","Type":"ContainerStarted","Data":"44b9f445055914fe07892343cf4c9c7aecfaef1936e0840f730e45d81e0bce94"} Apr 20 
21:20:18.879422 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:18.879379 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-5mjpq" event={"ID":"a2b5b080-0f37-4b48-a783-aeaff522e227","Type":"ContainerStarted","Data":"d092e5c835bdec08a5bdd6837f813cd9b2bd1fe938a8e1de3ea0f3de3c964cc0"} Apr 20 21:20:18.879860 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:18.879481 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-5mjpq" Apr 20 21:20:18.901734 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:18.901639 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-5mjpq" podStartSLOduration=1.654138229 podStartE2EDuration="3.901624174s" podCreationTimestamp="2026-04-20 21:20:15 +0000 UTC" firstStartedPulling="2026-04-20 21:20:16.041789651 +0000 UTC m=+426.284748750" lastFinishedPulling="2026-04-20 21:20:18.289275583 +0000 UTC m=+428.532234695" observedRunningTime="2026-04-20 21:20:18.900783618 +0000 UTC m=+429.143742738" watchObservedRunningTime="2026-04-20 21:20:18.901624174 +0000 UTC m=+429.144583305" Apr 20 21:20:19.760305 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:19.760267 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2"] Apr 20 21:20:19.763751 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:19.763730 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" Apr 20 21:20:19.765774 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:19.765753 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-j9hwt\"" Apr 20 21:20:19.770455 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:19.770431 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2"] Apr 20 21:20:19.894043 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:19.894014 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5vhl\" (UniqueName: \"kubernetes.io/projected/2a9d0d39-183b-4fa9-8a3d-0aa95de5d085-kube-api-access-v5vhl\") pod \"limitador-operator-controller-manager-85c4996f8c-nf6x2\" (UID: \"2a9d0d39-183b-4fa9-8a3d-0aa95de5d085\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" Apr 20 21:20:19.994750 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:19.994708 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5vhl\" (UniqueName: \"kubernetes.io/projected/2a9d0d39-183b-4fa9-8a3d-0aa95de5d085-kube-api-access-v5vhl\") pod \"limitador-operator-controller-manager-85c4996f8c-nf6x2\" (UID: \"2a9d0d39-183b-4fa9-8a3d-0aa95de5d085\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" Apr 20 21:20:20.007299 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:20.007274 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5vhl\" (UniqueName: \"kubernetes.io/projected/2a9d0d39-183b-4fa9-8a3d-0aa95de5d085-kube-api-access-v5vhl\") pod \"limitador-operator-controller-manager-85c4996f8c-nf6x2\" (UID: \"2a9d0d39-183b-4fa9-8a3d-0aa95de5d085\") " 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" Apr 20 21:20:20.074346 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:20.074275 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" Apr 20 21:20:20.206714 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:20.206687 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2"] Apr 20 21:20:20.209190 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:20:20.209139 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a9d0d39_183b_4fa9_8a3d_0aa95de5d085.slice/crio-e33ac7c682c1dae6dbe32b7dd334cf52c5af6717e0d57edabbd0b2f6b185d9bf WatchSource:0}: Error finding container e33ac7c682c1dae6dbe32b7dd334cf52c5af6717e0d57edabbd0b2f6b185d9bf: Status 404 returned error can't find the container with id e33ac7c682c1dae6dbe32b7dd334cf52c5af6717e0d57edabbd0b2f6b185d9bf Apr 20 21:20:20.887753 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:20.887720 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" event={"ID":"2a9d0d39-183b-4fa9-8a3d-0aa95de5d085","Type":"ContainerStarted","Data":"e33ac7c682c1dae6dbe32b7dd334cf52c5af6717e0d57edabbd0b2f6b185d9bf"} Apr 20 21:20:22.896857 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:22.896814 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" event={"ID":"2a9d0d39-183b-4fa9-8a3d-0aa95de5d085","Type":"ContainerStarted","Data":"149e8b8b50695a235d60c17f58b437e02af1eddd24d48ddc17669b19ffcd88b9"} Apr 20 21:20:22.897246 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:22.896977 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" Apr 20 21:20:22.912823 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:22.912762 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" podStartSLOduration=2.117659733 podStartE2EDuration="3.912746781s" podCreationTimestamp="2026-04-20 21:20:19 +0000 UTC" firstStartedPulling="2026-04-20 21:20:20.211234938 +0000 UTC m=+430.454194037" lastFinishedPulling="2026-04-20 21:20:22.006321984 +0000 UTC m=+432.249281085" observedRunningTime="2026-04-20 21:20:22.910601106 +0000 UTC m=+433.153560220" watchObservedRunningTime="2026-04-20 21:20:22.912746781 +0000 UTC m=+433.155705897" Apr 20 21:20:29.885123 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:29.885045 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-5mjpq" Apr 20 21:20:33.902535 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:33.902505 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" Apr 20 21:20:35.564108 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.564078 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv"] Apr 20 21:20:35.568937 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.568917 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv" Apr 20 21:20:35.570966 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.570943 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 20 21:20:35.571093 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.571014 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 20 21:20:35.571093 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.571019 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-tx77r\"" Apr 20 21:20:35.580224 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.580201 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv"] Apr 20 21:20:35.726756 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.726720 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/aef360e0-704a-4637-997b-87e74a9488bc-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-476zv\" (UID: \"aef360e0-704a-4637-997b-87e74a9488bc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv" Apr 20 21:20:35.726923 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.726801 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/aef360e0-704a-4637-997b-87e74a9488bc-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-476zv\" (UID: \"aef360e0-704a-4637-997b-87e74a9488bc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv" Apr 20 21:20:35.726923 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.726845 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-5tdzh\" (UniqueName: \"kubernetes.io/projected/aef360e0-704a-4637-997b-87e74a9488bc-kube-api-access-5tdzh\") pod \"kuadrant-console-plugin-6cb54b5c86-476zv\" (UID: \"aef360e0-704a-4637-997b-87e74a9488bc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv" Apr 20 21:20:35.827322 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.827238 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tdzh\" (UniqueName: \"kubernetes.io/projected/aef360e0-704a-4637-997b-87e74a9488bc-kube-api-access-5tdzh\") pod \"kuadrant-console-plugin-6cb54b5c86-476zv\" (UID: \"aef360e0-704a-4637-997b-87e74a9488bc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv" Apr 20 21:20:35.827322 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.827297 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/aef360e0-704a-4637-997b-87e74a9488bc-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-476zv\" (UID: \"aef360e0-704a-4637-997b-87e74a9488bc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv" Apr 20 21:20:35.827527 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.827350 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/aef360e0-704a-4637-997b-87e74a9488bc-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-476zv\" (UID: \"aef360e0-704a-4637-997b-87e74a9488bc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv" Apr 20 21:20:35.827973 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.827948 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/aef360e0-704a-4637-997b-87e74a9488bc-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-476zv\" (UID: \"aef360e0-704a-4637-997b-87e74a9488bc\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv" Apr 20 21:20:35.829839 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.829821 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/aef360e0-704a-4637-997b-87e74a9488bc-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-476zv\" (UID: \"aef360e0-704a-4637-997b-87e74a9488bc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv" Apr 20 21:20:35.835291 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.835268 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tdzh\" (UniqueName: \"kubernetes.io/projected/aef360e0-704a-4637-997b-87e74a9488bc-kube-api-access-5tdzh\") pod \"kuadrant-console-plugin-6cb54b5c86-476zv\" (UID: \"aef360e0-704a-4637-997b-87e74a9488bc\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv" Apr 20 21:20:35.878902 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:35.878879 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv" Apr 20 21:20:36.005012 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:36.004989 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv"] Apr 20 21:20:36.007084 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:20:36.007054 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaef360e0_704a_4637_997b_87e74a9488bc.slice/crio-20652882f87c192fb6f4d43091c4a65df86d3b75be1cce341d2f733321f081d6 WatchSource:0}: Error finding container 20652882f87c192fb6f4d43091c4a65df86d3b75be1cce341d2f733321f081d6: Status 404 returned error can't find the container with id 20652882f87c192fb6f4d43091c4a65df86d3b75be1cce341d2f733321f081d6 Apr 20 21:20:36.951597 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:36.951559 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv" event={"ID":"aef360e0-704a-4637-997b-87e74a9488bc","Type":"ContainerStarted","Data":"20652882f87c192fb6f4d43091c4a65df86d3b75be1cce341d2f733321f081d6"} Apr 20 21:20:46.365397 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:46.365363 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2"] Apr 20 21:20:46.365881 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:46.365643 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" podUID="2a9d0d39-183b-4fa9-8a3d-0aa95de5d085" containerName="manager" containerID="cri-o://149e8b8b50695a235d60c17f58b437e02af1eddd24d48ddc17669b19ffcd88b9" gracePeriod=2 Apr 20 21:20:46.385039 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:46.385011 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2"] Apr 20 21:20:46.405362 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:46.405043 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sbtxk"] Apr 20 21:20:46.405923 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:46.405900 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a9d0d39-183b-4fa9-8a3d-0aa95de5d085" containerName="manager" Apr 20 21:20:46.406020 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:46.405927 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a9d0d39-183b-4fa9-8a3d-0aa95de5d085" containerName="manager" Apr 20 21:20:46.406074 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:46.406025 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a9d0d39-183b-4fa9-8a3d-0aa95de5d085" containerName="manager" Apr 20 21:20:46.409044 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:46.409025 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sbtxk" Apr 20 21:20:46.415328 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:46.415269 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cw9c\" (UniqueName: \"kubernetes.io/projected/a84abe7e-a70e-49c4-8943-79c66da4e3a0-kube-api-access-7cw9c\") pod \"limitador-operator-controller-manager-85c4996f8c-sbtxk\" (UID: \"a84abe7e-a70e-49c4-8943-79c66da4e3a0\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sbtxk" Apr 20 21:20:46.422543 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:46.422501 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sbtxk"] Apr 20 21:20:46.515983 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:46.515941 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cw9c\" (UniqueName: \"kubernetes.io/projected/a84abe7e-a70e-49c4-8943-79c66da4e3a0-kube-api-access-7cw9c\") pod \"limitador-operator-controller-manager-85c4996f8c-sbtxk\" (UID: \"a84abe7e-a70e-49c4-8943-79c66da4e3a0\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sbtxk" Apr 20 21:20:46.525771 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:46.525735 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cw9c\" (UniqueName: \"kubernetes.io/projected/a84abe7e-a70e-49c4-8943-79c66da4e3a0-kube-api-access-7cw9c\") pod \"limitador-operator-controller-manager-85c4996f8c-sbtxk\" (UID: \"a84abe7e-a70e-49c4-8943-79c66da4e3a0\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sbtxk" Apr 20 21:20:46.735957 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:20:46.735922 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sbtxk" Apr 20 21:21:00.200159 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:00.200096 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sbtxk"] Apr 20 21:21:00.202231 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:21:00.202203 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda84abe7e_a70e_49c4_8943_79c66da4e3a0.slice/crio-12b116cecdcbe6affff1abb00fb2aedc32861684496df2254675ce879f0f13be WatchSource:0}: Error finding container 12b116cecdcbe6affff1abb00fb2aedc32861684496df2254675ce879f0f13be: Status 404 returned error can't find the container with id 12b116cecdcbe6affff1abb00fb2aedc32861684496df2254675ce879f0f13be Apr 20 21:21:00.205239 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:00.205164 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" Apr 20 21:21:00.207439 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:00.207406 2571 status_manager.go:895] "Failed to get status for pod" podUID="2a9d0d39-183b-4fa9-8a3d-0aa95de5d085" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" err="pods \"limitador-operator-controller-manager-85c4996f8c-nf6x2\" is forbidden: User \"system:node:ip-10-0-129-149.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-149.ec2.internal' and this object" Apr 20 21:21:00.247261 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:00.247239 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5vhl\" (UniqueName: \"kubernetes.io/projected/2a9d0d39-183b-4fa9-8a3d-0aa95de5d085-kube-api-access-v5vhl\") pod \"2a9d0d39-183b-4fa9-8a3d-0aa95de5d085\" (UID: \"2a9d0d39-183b-4fa9-8a3d-0aa95de5d085\") " Apr 20 21:21:00.249399 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:00.249374 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a9d0d39-183b-4fa9-8a3d-0aa95de5d085-kube-api-access-v5vhl" (OuterVolumeSpecName: "kube-api-access-v5vhl") pod "2a9d0d39-183b-4fa9-8a3d-0aa95de5d085" (UID: "2a9d0d39-183b-4fa9-8a3d-0aa95de5d085"). InnerVolumeSpecName "kube-api-access-v5vhl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:21:00.341743 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:00.341677 2571 status_manager.go:895] "Failed to get status for pod" podUID="2a9d0d39-183b-4fa9-8a3d-0aa95de5d085" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" err="pods \"limitador-operator-controller-manager-85c4996f8c-nf6x2\" is forbidden: User \"system:node:ip-10-0-129-149.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-149.ec2.internal' and this object" Apr 20 21:21:00.341877 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:00.341859 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a9d0d39-183b-4fa9-8a3d-0aa95de5d085" path="/var/lib/kubelet/pods/2a9d0d39-183b-4fa9-8a3d-0aa95de5d085/volumes" Apr 20 21:21:00.348315 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:00.348286 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v5vhl\" (UniqueName: \"kubernetes.io/projected/2a9d0d39-183b-4fa9-8a3d-0aa95de5d085-kube-api-access-v5vhl\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:21:01.053304 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:01.053272 2571 generic.go:358] "Generic (PLEG): container finished" podID="2a9d0d39-183b-4fa9-8a3d-0aa95de5d085" containerID="149e8b8b50695a235d60c17f58b437e02af1eddd24d48ddc17669b19ffcd88b9" exitCode=0 Apr 20 21:21:01.053566 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:01.053316 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" Apr 20 21:21:01.053566 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:01.053375 2571 scope.go:117] "RemoveContainer" containerID="149e8b8b50695a235d60c17f58b437e02af1eddd24d48ddc17669b19ffcd88b9" Apr 20 21:21:01.054999 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:01.054974 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv" event={"ID":"aef360e0-704a-4637-997b-87e74a9488bc","Type":"ContainerStarted","Data":"e6abdd51de5b35435e2ea5ead7a7276d8502e5b37babf35093e78f2ef00bba6a"} Apr 20 21:21:01.055259 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:01.055232 2571 status_manager.go:895] "Failed to get status for pod" podUID="2a9d0d39-183b-4fa9-8a3d-0aa95de5d085" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" err="pods \"limitador-operator-controller-manager-85c4996f8c-nf6x2\" is forbidden: User \"system:node:ip-10-0-129-149.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-149.ec2.internal' and this object" Apr 20 21:21:01.056751 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:01.056720 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sbtxk" event={"ID":"a84abe7e-a70e-49c4-8943-79c66da4e3a0","Type":"ContainerStarted","Data":"3ae2cd3a0d425ec12f38e049ef89425f7ff76d4aeb9551613c7e5c7708255468"} Apr 20 21:21:01.056845 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:01.056750 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sbtxk" event={"ID":"a84abe7e-a70e-49c4-8943-79c66da4e3a0","Type":"ContainerStarted","Data":"12b116cecdcbe6affff1abb00fb2aedc32861684496df2254675ce879f0f13be"} Apr 20 21:21:01.056890 ip-10-0-129-149 
kubenswrapper[2571]: I0420 21:21:01.056859 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sbtxk" Apr 20 21:21:01.056924 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:01.056891 2571 status_manager.go:895] "Failed to get status for pod" podUID="2a9d0d39-183b-4fa9-8a3d-0aa95de5d085" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" err="pods \"limitador-operator-controller-manager-85c4996f8c-nf6x2\" is forbidden: User \"system:node:ip-10-0-129-149.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-149.ec2.internal' and this object" Apr 20 21:21:01.062096 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:01.062081 2571 scope.go:117] "RemoveContainer" containerID="149e8b8b50695a235d60c17f58b437e02af1eddd24d48ddc17669b19ffcd88b9" Apr 20 21:21:01.062350 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:21:01.062331 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"149e8b8b50695a235d60c17f58b437e02af1eddd24d48ddc17669b19ffcd88b9\": container with ID starting with 149e8b8b50695a235d60c17f58b437e02af1eddd24d48ddc17669b19ffcd88b9 not found: ID does not exist" containerID="149e8b8b50695a235d60c17f58b437e02af1eddd24d48ddc17669b19ffcd88b9" Apr 20 21:21:01.062430 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:01.062373 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"149e8b8b50695a235d60c17f58b437e02af1eddd24d48ddc17669b19ffcd88b9"} err="failed to get container status \"149e8b8b50695a235d60c17f58b437e02af1eddd24d48ddc17669b19ffcd88b9\": rpc error: code = NotFound desc = could not find container \"149e8b8b50695a235d60c17f58b437e02af1eddd24d48ddc17669b19ffcd88b9\": container with ID starting with 
149e8b8b50695a235d60c17f58b437e02af1eddd24d48ddc17669b19ffcd88b9 not found: ID does not exist" Apr 20 21:21:01.072068 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:01.072039 2571 status_manager.go:895] "Failed to get status for pod" podUID="2a9d0d39-183b-4fa9-8a3d-0aa95de5d085" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-nf6x2" err="pods \"limitador-operator-controller-manager-85c4996f8c-nf6x2\" is forbidden: User \"system:node:ip-10-0-129-149.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-149.ec2.internal' and this object" Apr 20 21:21:01.072692 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:01.072650 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-476zv" podStartSLOduration=1.9442198689999999 podStartE2EDuration="26.072639328s" podCreationTimestamp="2026-04-20 21:20:35 +0000 UTC" firstStartedPulling="2026-04-20 21:20:36.00839396 +0000 UTC m=+446.251353058" lastFinishedPulling="2026-04-20 21:21:00.136813415 +0000 UTC m=+470.379772517" observedRunningTime="2026-04-20 21:21:01.070260623 +0000 UTC m=+471.313219751" watchObservedRunningTime="2026-04-20 21:21:01.072639328 +0000 UTC m=+471.315598448" Apr 20 21:21:01.086298 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:01.086250 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sbtxk" podStartSLOduration=15.086236299 podStartE2EDuration="15.086236299s" podCreationTimestamp="2026-04-20 21:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:21:01.085706297 +0000 UTC m=+471.328665417" watchObservedRunningTime="2026-04-20 21:21:01.086236299 +0000 UTC m=+471.329195420" Apr 20 21:21:12.063975 ip-10-0-129-149 
kubenswrapper[2571]: I0420 21:21:12.063942 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sbtxk" Apr 20 21:21:17.759454 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:17.759411 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6"] Apr 20 21:21:17.880085 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:17.880050 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6"] Apr 20 21:21:17.880248 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:17.880194 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:17.882508 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:17.882485 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-7m4fp\"" Apr 20 21:21:17.994888 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:17.994856 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/44bb23ed-dd9f-450b-b0ea-db4715d86492-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:17.995065 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:17.994907 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/44bb23ed-dd9f-450b-b0ea-db4715d86492-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") 
" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:17.995065 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:17.994926 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/44bb23ed-dd9f-450b-b0ea-db4715d86492-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:17.995065 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:17.994973 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/44bb23ed-dd9f-450b-b0ea-db4715d86492-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:17.995065 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:17.995021 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/44bb23ed-dd9f-450b-b0ea-db4715d86492-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:17.995323 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:17.995108 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/44bb23ed-dd9f-450b-b0ea-db4715d86492-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:17.995323 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:17.995139 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/44bb23ed-dd9f-450b-b0ea-db4715d86492-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:17.995323 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:17.995207 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbgn2\" (UniqueName: \"kubernetes.io/projected/44bb23ed-dd9f-450b-b0ea-db4715d86492-kube-api-access-gbgn2\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:17.995323 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:17.995253 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/44bb23ed-dd9f-450b-b0ea-db4715d86492-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.096409 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.096331 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/44bb23ed-dd9f-450b-b0ea-db4715d86492-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.096409 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.096369 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/44bb23ed-dd9f-450b-b0ea-db4715d86492-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.096409 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.096394 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbgn2\" (UniqueName: \"kubernetes.io/projected/44bb23ed-dd9f-450b-b0ea-db4715d86492-kube-api-access-gbgn2\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.096691 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.096522 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/44bb23ed-dd9f-450b-b0ea-db4715d86492-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.096691 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.096578 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/44bb23ed-dd9f-450b-b0ea-db4715d86492-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.096691 
ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.096633 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/44bb23ed-dd9f-450b-b0ea-db4715d86492-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.096691 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.096661 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/44bb23ed-dd9f-450b-b0ea-db4715d86492-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.096691 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.096690 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/44bb23ed-dd9f-450b-b0ea-db4715d86492-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.096954 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.096722 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/44bb23ed-dd9f-450b-b0ea-db4715d86492-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.096954 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.096797 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/44bb23ed-dd9f-450b-b0ea-db4715d86492-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.096954 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.096925 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/44bb23ed-dd9f-450b-b0ea-db4715d86492-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.097130 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.097043 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/44bb23ed-dd9f-450b-b0ea-db4715d86492-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.097130 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.097060 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/44bb23ed-dd9f-450b-b0ea-db4715d86492-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.097341 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.097319 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/44bb23ed-dd9f-450b-b0ea-db4715d86492-istiod-ca-cert\") pod 
\"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.099068 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.099051 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/44bb23ed-dd9f-450b-b0ea-db4715d86492-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.099353 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.099337 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/44bb23ed-dd9f-450b-b0ea-db4715d86492-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.103519 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.103493 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/44bb23ed-dd9f-450b-b0ea-db4715d86492-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.103707 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.103690 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbgn2\" (UniqueName: \"kubernetes.io/projected/44bb23ed-dd9f-450b-b0ea-db4715d86492-kube-api-access-gbgn2\") pod \"maas-default-gateway-openshift-default-58b6f876-sqlf6\" (UID: \"44bb23ed-dd9f-450b-b0ea-db4715d86492\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.190983 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.190954 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:18.322518 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.322493 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6"] Apr 20 21:21:18.324714 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:21:18.324688 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44bb23ed_dd9f_450b_b0ea_db4715d86492.slice/crio-6b22ae47a31694339fb9bd77166cbf058c0fee460faeb86dd358b347d59804d9 WatchSource:0}: Error finding container 6b22ae47a31694339fb9bd77166cbf058c0fee460faeb86dd358b347d59804d9: Status 404 returned error can't find the container with id 6b22ae47a31694339fb9bd77166cbf058c0fee460faeb86dd358b347d59804d9 Apr 20 21:21:18.327088 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.327047 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 21:21:18.327193 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.327135 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 21:21:18.327240 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:18.327209 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 21:21:19.123064 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:19.123030 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" event={"ID":"44bb23ed-dd9f-450b-b0ea-db4715d86492","Type":"ContainerStarted","Data":"1fbc0b76be2fcd6df1f96fd4b9172dc474c1d23154971a90ada4918e794f3845"} Apr 20 21:21:19.123064 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:19.123068 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" event={"ID":"44bb23ed-dd9f-450b-b0ea-db4715d86492","Type":"ContainerStarted","Data":"6b22ae47a31694339fb9bd77166cbf058c0fee460faeb86dd358b347d59804d9"} Apr 20 21:21:19.146548 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:19.146503 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" podStartSLOduration=2.146489284 podStartE2EDuration="2.146489284s" podCreationTimestamp="2026-04-20 21:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:21:19.144944497 +0000 UTC m=+489.387903617" watchObservedRunningTime="2026-04-20 21:21:19.146489284 +0000 UTC m=+489.389448404" Apr 20 21:21:19.192040 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:19.192014 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:19.196904 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:19.196882 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:20.126966 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:20.126941 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:20.128054 ip-10-0-129-149 
kubenswrapper[2571]: I0420 21:21:20.128031 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-sqlf6" Apr 20 21:21:31.616126 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:31.616085 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-s72vj"] Apr 20 21:21:31.707928 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:31.707891 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-s72vj"] Apr 20 21:21:31.708083 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:31.707955 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-s72vj" Apr 20 21:21:31.710189 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:31.710158 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-pcgm8\"" Apr 20 21:21:31.814839 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:31.814807 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drv9t\" (UniqueName: \"kubernetes.io/projected/f1723fae-a935-4c50-847c-e61ed8770714-kube-api-access-drv9t\") pod \"authorino-7498df8756-s72vj\" (UID: \"f1723fae-a935-4c50-847c-e61ed8770714\") " pod="kuadrant-system/authorino-7498df8756-s72vj" Apr 20 21:21:31.916089 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:31.916019 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drv9t\" (UniqueName: \"kubernetes.io/projected/f1723fae-a935-4c50-847c-e61ed8770714-kube-api-access-drv9t\") pod \"authorino-7498df8756-s72vj\" (UID: \"f1723fae-a935-4c50-847c-e61ed8770714\") " pod="kuadrant-system/authorino-7498df8756-s72vj" Apr 20 21:21:31.922841 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:31.922815 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-drv9t\" (UniqueName: \"kubernetes.io/projected/f1723fae-a935-4c50-847c-e61ed8770714-kube-api-access-drv9t\") pod \"authorino-7498df8756-s72vj\" (UID: \"f1723fae-a935-4c50-847c-e61ed8770714\") " pod="kuadrant-system/authorino-7498df8756-s72vj" Apr 20 21:21:32.017125 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:32.017075 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-s72vj" Apr 20 21:21:32.141090 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:32.141062 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-s72vj"] Apr 20 21:21:32.143600 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:21:32.143567 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1723fae_a935_4c50_847c_e61ed8770714.slice/crio-5d4814e6743640ca6506316cf0d59bf2916cd67ea0a77d00738c75a5db56a70d WatchSource:0}: Error finding container 5d4814e6743640ca6506316cf0d59bf2916cd67ea0a77d00738c75a5db56a70d: Status 404 returned error can't find the container with id 5d4814e6743640ca6506316cf0d59bf2916cd67ea0a77d00738c75a5db56a70d Apr 20 21:21:32.183843 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:32.183771 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-s72vj" event={"ID":"f1723fae-a935-4c50-847c-e61ed8770714","Type":"ContainerStarted","Data":"5d4814e6743640ca6506316cf0d59bf2916cd67ea0a77d00738c75a5db56a70d"} Apr 20 21:21:35.198341 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:35.198305 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-s72vj" event={"ID":"f1723fae-a935-4c50-847c-e61ed8770714","Type":"ContainerStarted","Data":"02c8862ea4dd36fe74f1cfa4abf4e2fe4bde60b00abda43c40912bdc7d775162"} Apr 20 21:21:35.214726 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:21:35.214672 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-s72vj" podStartSLOduration=1.252445533 podStartE2EDuration="4.214655355s" podCreationTimestamp="2026-04-20 21:21:31 +0000 UTC" firstStartedPulling="2026-04-20 21:21:32.144875249 +0000 UTC m=+502.387834347" lastFinishedPulling="2026-04-20 21:21:35.107085067 +0000 UTC m=+505.350044169" observedRunningTime="2026-04-20 21:21:35.211284826 +0000 UTC m=+505.454243946" watchObservedRunningTime="2026-04-20 21:21:35.214655355 +0000 UTC m=+505.457614474" Apr 20 21:22:00.528511 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:00.528437 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mtrkd"] Apr 20 21:22:00.532006 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:00.531988 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-mtrkd" Apr 20 21:22:00.538610 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:00.538581 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mtrkd"] Apr 20 21:22:00.565225 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:00.565172 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c9g2\" (UniqueName: \"kubernetes.io/projected/d2b525ef-b582-4d5a-80a9-01682810d764-kube-api-access-6c9g2\") pod \"authorino-8b475cf9f-mtrkd\" (UID: \"d2b525ef-b582-4d5a-80a9-01682810d764\") " pod="kuadrant-system/authorino-8b475cf9f-mtrkd" Apr 20 21:22:00.665716 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:00.665677 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6c9g2\" (UniqueName: \"kubernetes.io/projected/d2b525ef-b582-4d5a-80a9-01682810d764-kube-api-access-6c9g2\") pod \"authorino-8b475cf9f-mtrkd\" (UID: \"d2b525ef-b582-4d5a-80a9-01682810d764\") " pod="kuadrant-system/authorino-8b475cf9f-mtrkd" Apr 
20 21:22:00.673573 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:00.673549 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c9g2\" (UniqueName: \"kubernetes.io/projected/d2b525ef-b582-4d5a-80a9-01682810d764-kube-api-access-6c9g2\") pod \"authorino-8b475cf9f-mtrkd\" (UID: \"d2b525ef-b582-4d5a-80a9-01682810d764\") " pod="kuadrant-system/authorino-8b475cf9f-mtrkd" Apr 20 21:22:00.738092 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:00.738059 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mtrkd"] Apr 20 21:22:00.738329 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:00.738317 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-mtrkd" Apr 20 21:22:00.763876 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:00.763842 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-6c6bbc787f-6dnr4"] Apr 20 21:22:00.768299 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:00.768277 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6c6bbc787f-6dnr4" Apr 20 21:22:00.773594 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:00.773569 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6c6bbc787f-6dnr4"] Apr 20 21:22:00.867578 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:00.867549 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcv2r\" (UniqueName: \"kubernetes.io/projected/ba6543b2-957c-489c-8cbc-8edea2a8c16d-kube-api-access-pcv2r\") pod \"authorino-6c6bbc787f-6dnr4\" (UID: \"ba6543b2-957c-489c-8cbc-8edea2a8c16d\") " pod="kuadrant-system/authorino-6c6bbc787f-6dnr4" Apr 20 21:22:00.870667 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:00.870644 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mtrkd"] Apr 20 21:22:00.873227 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:22:00.873198 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2b525ef_b582_4d5a_80a9_01682810d764.slice/crio-514d300475a48478f834d32375a785bde8f27fd80ebd0079430b95a4ffbdc789 WatchSource:0}: Error finding container 514d300475a48478f834d32375a785bde8f27fd80ebd0079430b95a4ffbdc789: Status 404 returned error can't find the container with id 514d300475a48478f834d32375a785bde8f27fd80ebd0079430b95a4ffbdc789 Apr 20 21:22:00.968786 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:00.968752 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcv2r\" (UniqueName: \"kubernetes.io/projected/ba6543b2-957c-489c-8cbc-8edea2a8c16d-kube-api-access-pcv2r\") pod \"authorino-6c6bbc787f-6dnr4\" (UID: \"ba6543b2-957c-489c-8cbc-8edea2a8c16d\") " pod="kuadrant-system/authorino-6c6bbc787f-6dnr4" Apr 20 21:22:00.977152 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:00.977117 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pcv2r\" (UniqueName: \"kubernetes.io/projected/ba6543b2-957c-489c-8cbc-8edea2a8c16d-kube-api-access-pcv2r\") pod \"authorino-6c6bbc787f-6dnr4\" (UID: \"ba6543b2-957c-489c-8cbc-8edea2a8c16d\") " pod="kuadrant-system/authorino-6c6bbc787f-6dnr4" Apr 20 21:22:01.036772 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.036728 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6c6bbc787f-6dnr4"] Apr 20 21:22:01.037029 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.037013 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6c6bbc787f-6dnr4" Apr 20 21:22:01.065101 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.065070 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-85ff7bd8dc-p2z8j"] Apr 20 21:22:01.069943 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.069917 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" Apr 20 21:22:01.072096 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.072072 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 20 21:22:01.075706 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.075682 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-85ff7bd8dc-p2z8j"] Apr 20 21:22:01.168360 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.168335 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6c6bbc787f-6dnr4"] Apr 20 21:22:01.170850 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.170817 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0017fa53-28a3-4fbd-a39c-9a40c5def571-tls-cert\") pod \"authorino-85ff7bd8dc-p2z8j\" (UID: \"0017fa53-28a3-4fbd-a39c-9a40c5def571\") " pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" Apr 20 21:22:01.170850 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:22:01.170835 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba6543b2_957c_489c_8cbc_8edea2a8c16d.slice/crio-bed68fa226667afbe41af946097f75267e4cc78a4cd3b08bf4433b2d3bdbcd09 WatchSource:0}: Error finding container bed68fa226667afbe41af946097f75267e4cc78a4cd3b08bf4433b2d3bdbcd09: Status 404 returned error can't find the container with id bed68fa226667afbe41af946097f75267e4cc78a4cd3b08bf4433b2d3bdbcd09 Apr 20 21:22:01.171036 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.170952 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgfxq\" (UniqueName: \"kubernetes.io/projected/0017fa53-28a3-4fbd-a39c-9a40c5def571-kube-api-access-vgfxq\") pod \"authorino-85ff7bd8dc-p2z8j\" (UID: 
\"0017fa53-28a3-4fbd-a39c-9a40c5def571\") " pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" Apr 20 21:22:01.272169 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.272137 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0017fa53-28a3-4fbd-a39c-9a40c5def571-tls-cert\") pod \"authorino-85ff7bd8dc-p2z8j\" (UID: \"0017fa53-28a3-4fbd-a39c-9a40c5def571\") " pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" Apr 20 21:22:01.272336 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.272250 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgfxq\" (UniqueName: \"kubernetes.io/projected/0017fa53-28a3-4fbd-a39c-9a40c5def571-kube-api-access-vgfxq\") pod \"authorino-85ff7bd8dc-p2z8j\" (UID: \"0017fa53-28a3-4fbd-a39c-9a40c5def571\") " pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" Apr 20 21:22:01.274654 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.274629 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0017fa53-28a3-4fbd-a39c-9a40c5def571-tls-cert\") pod \"authorino-85ff7bd8dc-p2z8j\" (UID: \"0017fa53-28a3-4fbd-a39c-9a40c5def571\") " pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" Apr 20 21:22:01.282436 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.281964 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgfxq\" (UniqueName: \"kubernetes.io/projected/0017fa53-28a3-4fbd-a39c-9a40c5def571-kube-api-access-vgfxq\") pod \"authorino-85ff7bd8dc-p2z8j\" (UID: \"0017fa53-28a3-4fbd-a39c-9a40c5def571\") " pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" Apr 20 21:22:01.287254 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.287219 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6c6bbc787f-6dnr4" 
event={"ID":"ba6543b2-957c-489c-8cbc-8edea2a8c16d","Type":"ContainerStarted","Data":"bed68fa226667afbe41af946097f75267e4cc78a4cd3b08bf4433b2d3bdbcd09"} Apr 20 21:22:01.288889 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.288858 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-mtrkd" event={"ID":"d2b525ef-b582-4d5a-80a9-01682810d764","Type":"ContainerStarted","Data":"28aa3f14ac060a3c8fef9b10521f0334628c0a4ebdd590bb6d2d75fdafc60169"} Apr 20 21:22:01.288984 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.288896 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-mtrkd" event={"ID":"d2b525ef-b582-4d5a-80a9-01682810d764","Type":"ContainerStarted","Data":"514d300475a48478f834d32375a785bde8f27fd80ebd0079430b95a4ffbdc789"} Apr 20 21:22:01.289024 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.288997 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-mtrkd" podUID="d2b525ef-b582-4d5a-80a9-01682810d764" containerName="authorino" containerID="cri-o://28aa3f14ac060a3c8fef9b10521f0334628c0a4ebdd590bb6d2d75fdafc60169" gracePeriod=30 Apr 20 21:22:01.303948 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.303904 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-mtrkd" podStartSLOduration=0.968777217 podStartE2EDuration="1.303889291s" podCreationTimestamp="2026-04-20 21:22:00 +0000 UTC" firstStartedPulling="2026-04-20 21:22:00.874466106 +0000 UTC m=+531.117425205" lastFinishedPulling="2026-04-20 21:22:01.209578178 +0000 UTC m=+531.452537279" observedRunningTime="2026-04-20 21:22:01.302080721 +0000 UTC m=+531.545039842" watchObservedRunningTime="2026-04-20 21:22:01.303889291 +0000 UTC m=+531.546848410" Apr 20 21:22:01.381249 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.381216 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" Apr 20 21:22:01.528482 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.528452 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-85ff7bd8dc-p2z8j"] Apr 20 21:22:01.530983 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:22:01.530955 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0017fa53_28a3_4fbd_a39c_9a40c5def571.slice/crio-92a60f82ffc4a04cc35ae11448efe595bf21b81282c44233b6c42c0f61f449f8 WatchSource:0}: Error finding container 92a60f82ffc4a04cc35ae11448efe595bf21b81282c44233b6c42c0f61f449f8: Status 404 returned error can't find the container with id 92a60f82ffc4a04cc35ae11448efe595bf21b81282c44233b6c42c0f61f449f8 Apr 20 21:22:01.537621 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.537602 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-mtrkd" Apr 20 21:22:01.575277 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.575165 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c9g2\" (UniqueName: \"kubernetes.io/projected/d2b525ef-b582-4d5a-80a9-01682810d764-kube-api-access-6c9g2\") pod \"d2b525ef-b582-4d5a-80a9-01682810d764\" (UID: \"d2b525ef-b582-4d5a-80a9-01682810d764\") " Apr 20 21:22:01.577384 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.577356 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2b525ef-b582-4d5a-80a9-01682810d764-kube-api-access-6c9g2" (OuterVolumeSpecName: "kube-api-access-6c9g2") pod "d2b525ef-b582-4d5a-80a9-01682810d764" (UID: "d2b525ef-b582-4d5a-80a9-01682810d764"). InnerVolumeSpecName "kube-api-access-6c9g2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:22:01.676466 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:01.676441 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6c9g2\" (UniqueName: \"kubernetes.io/projected/d2b525ef-b582-4d5a-80a9-01682810d764-kube-api-access-6c9g2\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:22:02.293797 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.293758 2571 generic.go:358] "Generic (PLEG): container finished" podID="d2b525ef-b582-4d5a-80a9-01682810d764" containerID="28aa3f14ac060a3c8fef9b10521f0334628c0a4ebdd590bb6d2d75fdafc60169" exitCode=2 Apr 20 21:22:02.293947 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.293809 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-mtrkd" Apr 20 21:22:02.293947 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.293843 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-mtrkd" event={"ID":"d2b525ef-b582-4d5a-80a9-01682810d764","Type":"ContainerDied","Data":"28aa3f14ac060a3c8fef9b10521f0334628c0a4ebdd590bb6d2d75fdafc60169"} Apr 20 21:22:02.293947 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.293878 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-mtrkd" event={"ID":"d2b525ef-b582-4d5a-80a9-01682810d764","Type":"ContainerDied","Data":"514d300475a48478f834d32375a785bde8f27fd80ebd0079430b95a4ffbdc789"} Apr 20 21:22:02.293947 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.293899 2571 scope.go:117] "RemoveContainer" containerID="28aa3f14ac060a3c8fef9b10521f0334628c0a4ebdd590bb6d2d75fdafc60169" Apr 20 21:22:02.295378 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.295353 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6c6bbc787f-6dnr4" 
event={"ID":"ba6543b2-957c-489c-8cbc-8edea2a8c16d","Type":"ContainerStarted","Data":"6caac2f1d576c91e8e51a5d37a6face84b38810ac9f0ccb62ff44b473cbf4b24"} Apr 20 21:22:02.295378 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.295348 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-6c6bbc787f-6dnr4" podUID="ba6543b2-957c-489c-8cbc-8edea2a8c16d" containerName="authorino" containerID="cri-o://6caac2f1d576c91e8e51a5d37a6face84b38810ac9f0ccb62ff44b473cbf4b24" gracePeriod=30 Apr 20 21:22:02.297234 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.297148 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" event={"ID":"0017fa53-28a3-4fbd-a39c-9a40c5def571","Type":"ContainerStarted","Data":"0a060ec9784afcc577542bb6734b3338c25b513df75ba7253e86f78897548ec4"} Apr 20 21:22:02.297234 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.297210 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" event={"ID":"0017fa53-28a3-4fbd-a39c-9a40c5def571","Type":"ContainerStarted","Data":"92a60f82ffc4a04cc35ae11448efe595bf21b81282c44233b6c42c0f61f449f8"} Apr 20 21:22:02.304433 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.304414 2571 scope.go:117] "RemoveContainer" containerID="28aa3f14ac060a3c8fef9b10521f0334628c0a4ebdd590bb6d2d75fdafc60169" Apr 20 21:22:02.304727 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:22:02.304704 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28aa3f14ac060a3c8fef9b10521f0334628c0a4ebdd590bb6d2d75fdafc60169\": container with ID starting with 28aa3f14ac060a3c8fef9b10521f0334628c0a4ebdd590bb6d2d75fdafc60169 not found: ID does not exist" containerID="28aa3f14ac060a3c8fef9b10521f0334628c0a4ebdd590bb6d2d75fdafc60169" Apr 20 21:22:02.304814 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.304734 2571 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28aa3f14ac060a3c8fef9b10521f0334628c0a4ebdd590bb6d2d75fdafc60169"} err="failed to get container status \"28aa3f14ac060a3c8fef9b10521f0334628c0a4ebdd590bb6d2d75fdafc60169\": rpc error: code = NotFound desc = could not find container \"28aa3f14ac060a3c8fef9b10521f0334628c0a4ebdd590bb6d2d75fdafc60169\": container with ID starting with 28aa3f14ac060a3c8fef9b10521f0334628c0a4ebdd590bb6d2d75fdafc60169 not found: ID does not exist" Apr 20 21:22:02.314773 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.314735 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-6c6bbc787f-6dnr4" podStartSLOduration=1.8712212620000002 podStartE2EDuration="2.314723419s" podCreationTimestamp="2026-04-20 21:22:00 +0000 UTC" firstStartedPulling="2026-04-20 21:22:01.172235894 +0000 UTC m=+531.415194997" lastFinishedPulling="2026-04-20 21:22:01.615738053 +0000 UTC m=+531.858697154" observedRunningTime="2026-04-20 21:22:02.313499941 +0000 UTC m=+532.556459062" watchObservedRunningTime="2026-04-20 21:22:02.314723419 +0000 UTC m=+532.557682538" Apr 20 21:22:02.327950 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.327908 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" podStartSLOduration=0.812484992 podStartE2EDuration="1.327896951s" podCreationTimestamp="2026-04-20 21:22:01 +0000 UTC" firstStartedPulling="2026-04-20 21:22:01.532416463 +0000 UTC m=+531.775375564" lastFinishedPulling="2026-04-20 21:22:02.047828422 +0000 UTC m=+532.290787523" observedRunningTime="2026-04-20 21:22:02.326389036 +0000 UTC m=+532.569348157" watchObservedRunningTime="2026-04-20 21:22:02.327896951 +0000 UTC m=+532.570856070" Apr 20 21:22:02.342559 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.342532 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kuadrant-system/authorino-8b475cf9f-mtrkd"] Apr 20 21:22:02.350862 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.350829 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-mtrkd"] Apr 20 21:22:02.353728 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.353698 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-s72vj"] Apr 20 21:22:02.353946 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.353912 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-s72vj" podUID="f1723fae-a935-4c50-847c-e61ed8770714" containerName="authorino" containerID="cri-o://02c8862ea4dd36fe74f1cfa4abf4e2fe4bde60b00abda43c40912bdc7d775162" gracePeriod=30 Apr 20 21:22:02.602363 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.602331 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6c6bbc787f-6dnr4" Apr 20 21:22:02.612585 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.612566 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-s72vj" Apr 20 21:22:02.684828 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.684804 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcv2r\" (UniqueName: \"kubernetes.io/projected/ba6543b2-957c-489c-8cbc-8edea2a8c16d-kube-api-access-pcv2r\") pod \"ba6543b2-957c-489c-8cbc-8edea2a8c16d\" (UID: \"ba6543b2-957c-489c-8cbc-8edea2a8c16d\") " Apr 20 21:22:02.684932 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.684861 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drv9t\" (UniqueName: \"kubernetes.io/projected/f1723fae-a935-4c50-847c-e61ed8770714-kube-api-access-drv9t\") pod \"f1723fae-a935-4c50-847c-e61ed8770714\" (UID: \"f1723fae-a935-4c50-847c-e61ed8770714\") " Apr 20 21:22:02.687047 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.687009 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1723fae-a935-4c50-847c-e61ed8770714-kube-api-access-drv9t" (OuterVolumeSpecName: "kube-api-access-drv9t") pod "f1723fae-a935-4c50-847c-e61ed8770714" (UID: "f1723fae-a935-4c50-847c-e61ed8770714"). InnerVolumeSpecName "kube-api-access-drv9t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:22:02.687143 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.687021 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba6543b2-957c-489c-8cbc-8edea2a8c16d-kube-api-access-pcv2r" (OuterVolumeSpecName: "kube-api-access-pcv2r") pod "ba6543b2-957c-489c-8cbc-8edea2a8c16d" (UID: "ba6543b2-957c-489c-8cbc-8edea2a8c16d"). InnerVolumeSpecName "kube-api-access-pcv2r". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:22:02.786469 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.786432 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pcv2r\" (UniqueName: \"kubernetes.io/projected/ba6543b2-957c-489c-8cbc-8edea2a8c16d-kube-api-access-pcv2r\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:22:02.786469 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:02.786463 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-drv9t\" (UniqueName: \"kubernetes.io/projected/f1723fae-a935-4c50-847c-e61ed8770714-kube-api-access-drv9t\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:22:03.305469 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.305435 2571 generic.go:358] "Generic (PLEG): container finished" podID="f1723fae-a935-4c50-847c-e61ed8770714" containerID="02c8862ea4dd36fe74f1cfa4abf4e2fe4bde60b00abda43c40912bdc7d775162" exitCode=0 Apr 20 21:22:03.305632 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.305485 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-s72vj" Apr 20 21:22:03.305632 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.305516 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-s72vj" event={"ID":"f1723fae-a935-4c50-847c-e61ed8770714","Type":"ContainerDied","Data":"02c8862ea4dd36fe74f1cfa4abf4e2fe4bde60b00abda43c40912bdc7d775162"} Apr 20 21:22:03.305632 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.305546 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-s72vj" event={"ID":"f1723fae-a935-4c50-847c-e61ed8770714","Type":"ContainerDied","Data":"5d4814e6743640ca6506316cf0d59bf2916cd67ea0a77d00738c75a5db56a70d"} Apr 20 21:22:03.305632 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.305562 2571 scope.go:117] "RemoveContainer" containerID="02c8862ea4dd36fe74f1cfa4abf4e2fe4bde60b00abda43c40912bdc7d775162" Apr 20 21:22:03.307750 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.307729 2571 generic.go:358] "Generic (PLEG): container finished" podID="ba6543b2-957c-489c-8cbc-8edea2a8c16d" containerID="6caac2f1d576c91e8e51a5d37a6face84b38810ac9f0ccb62ff44b473cbf4b24" exitCode=0 Apr 20 21:22:03.307855 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.307789 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6c6bbc787f-6dnr4" Apr 20 21:22:03.307855 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.307823 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6c6bbc787f-6dnr4" event={"ID":"ba6543b2-957c-489c-8cbc-8edea2a8c16d","Type":"ContainerDied","Data":"6caac2f1d576c91e8e51a5d37a6face84b38810ac9f0ccb62ff44b473cbf4b24"} Apr 20 21:22:03.307952 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.307859 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6c6bbc787f-6dnr4" event={"ID":"ba6543b2-957c-489c-8cbc-8edea2a8c16d","Type":"ContainerDied","Data":"bed68fa226667afbe41af946097f75267e4cc78a4cd3b08bf4433b2d3bdbcd09"} Apr 20 21:22:03.321265 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.321236 2571 scope.go:117] "RemoveContainer" containerID="02c8862ea4dd36fe74f1cfa4abf4e2fe4bde60b00abda43c40912bdc7d775162" Apr 20 21:22:03.321568 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:22:03.321539 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02c8862ea4dd36fe74f1cfa4abf4e2fe4bde60b00abda43c40912bdc7d775162\": container with ID starting with 02c8862ea4dd36fe74f1cfa4abf4e2fe4bde60b00abda43c40912bdc7d775162 not found: ID does not exist" containerID="02c8862ea4dd36fe74f1cfa4abf4e2fe4bde60b00abda43c40912bdc7d775162" Apr 20 21:22:03.321683 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.321659 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02c8862ea4dd36fe74f1cfa4abf4e2fe4bde60b00abda43c40912bdc7d775162"} err="failed to get container status \"02c8862ea4dd36fe74f1cfa4abf4e2fe4bde60b00abda43c40912bdc7d775162\": rpc error: code = NotFound desc = could not find container \"02c8862ea4dd36fe74f1cfa4abf4e2fe4bde60b00abda43c40912bdc7d775162\": container with ID starting with 
02c8862ea4dd36fe74f1cfa4abf4e2fe4bde60b00abda43c40912bdc7d775162 not found: ID does not exist" Apr 20 21:22:03.321767 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.321688 2571 scope.go:117] "RemoveContainer" containerID="6caac2f1d576c91e8e51a5d37a6face84b38810ac9f0ccb62ff44b473cbf4b24" Apr 20 21:22:03.329370 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.329354 2571 scope.go:117] "RemoveContainer" containerID="6caac2f1d576c91e8e51a5d37a6face84b38810ac9f0ccb62ff44b473cbf4b24" Apr 20 21:22:03.329637 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:22:03.329619 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6caac2f1d576c91e8e51a5d37a6face84b38810ac9f0ccb62ff44b473cbf4b24\": container with ID starting with 6caac2f1d576c91e8e51a5d37a6face84b38810ac9f0ccb62ff44b473cbf4b24 not found: ID does not exist" containerID="6caac2f1d576c91e8e51a5d37a6face84b38810ac9f0ccb62ff44b473cbf4b24" Apr 20 21:22:03.329678 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.329643 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6caac2f1d576c91e8e51a5d37a6face84b38810ac9f0ccb62ff44b473cbf4b24"} err="failed to get container status \"6caac2f1d576c91e8e51a5d37a6face84b38810ac9f0ccb62ff44b473cbf4b24\": rpc error: code = NotFound desc = could not find container \"6caac2f1d576c91e8e51a5d37a6face84b38810ac9f0ccb62ff44b473cbf4b24\": container with ID starting with 6caac2f1d576c91e8e51a5d37a6face84b38810ac9f0ccb62ff44b473cbf4b24 not found: ID does not exist" Apr 20 21:22:03.335094 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.335069 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6c6bbc787f-6dnr4"] Apr 20 21:22:03.337862 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.337843 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-6c6bbc787f-6dnr4"] Apr 20 21:22:03.347049 ip-10-0-129-149 
kubenswrapper[2571]: I0420 21:22:03.347029 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-s72vj"] Apr 20 21:22:03.352673 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.352654 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-s72vj"] Apr 20 21:22:03.370955 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.370931 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7d98fbbfd6-zrgwd"] Apr 20 21:22:03.371304 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.371291 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2b525ef-b582-4d5a-80a9-01682810d764" containerName="authorino" Apr 20 21:22:03.371351 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.371306 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b525ef-b582-4d5a-80a9-01682810d764" containerName="authorino" Apr 20 21:22:03.371351 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.371317 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1723fae-a935-4c50-847c-e61ed8770714" containerName="authorino" Apr 20 21:22:03.371351 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.371322 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1723fae-a935-4c50-847c-e61ed8770714" containerName="authorino" Apr 20 21:22:03.371351 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.371340 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba6543b2-957c-489c-8cbc-8edea2a8c16d" containerName="authorino" Apr 20 21:22:03.371351 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.371345 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6543b2-957c-489c-8cbc-8edea2a8c16d" containerName="authorino" Apr 20 21:22:03.371487 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.371394 2571 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="d2b525ef-b582-4d5a-80a9-01682810d764" containerName="authorino"
Apr 20 21:22:03.371487 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.371404 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba6543b2-957c-489c-8cbc-8edea2a8c16d" containerName="authorino"
Apr 20 21:22:03.371487 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.371411 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1723fae-a935-4c50-847c-e61ed8770714" containerName="authorino"
Apr 20 21:22:03.375499 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.375484 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7d98fbbfd6-zrgwd"
Apr 20 21:22:03.377447 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.377430 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-prkzx\""
Apr 20 21:22:03.382210 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.382172 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7d98fbbfd6-zrgwd"]
Apr 20 21:22:03.492871 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.492837 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb8bz\" (UniqueName: \"kubernetes.io/projected/5c2a1054-7206-41ca-9226-d02a36d42349-kube-api-access-nb8bz\") pod \"maas-controller-7d98fbbfd6-zrgwd\" (UID: \"5c2a1054-7206-41ca-9226-d02a36d42349\") " pod="opendatahub/maas-controller-7d98fbbfd6-zrgwd"
Apr 20 21:22:03.594004 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.593914 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nb8bz\" (UniqueName: \"kubernetes.io/projected/5c2a1054-7206-41ca-9226-d02a36d42349-kube-api-access-nb8bz\") pod \"maas-controller-7d98fbbfd6-zrgwd\" (UID: \"5c2a1054-7206-41ca-9226-d02a36d42349\") " pod="opendatahub/maas-controller-7d98fbbfd6-zrgwd"
Apr 20 21:22:03.601935 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.601905 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb8bz\" (UniqueName: \"kubernetes.io/projected/5c2a1054-7206-41ca-9226-d02a36d42349-kube-api-access-nb8bz\") pod \"maas-controller-7d98fbbfd6-zrgwd\" (UID: \"5c2a1054-7206-41ca-9226-d02a36d42349\") " pod="opendatahub/maas-controller-7d98fbbfd6-zrgwd"
Apr 20 21:22:03.686772 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.686735 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7d98fbbfd6-zrgwd"
Apr 20 21:22:03.812862 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:03.812829 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7d98fbbfd6-zrgwd"]
Apr 20 21:22:03.816150 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:22:03.816117 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c2a1054_7206_41ca_9226_d02a36d42349.slice/crio-fb896377319c6725eba02d8c4511bcf9dd8dda3b8b58b64b3d1dd7495453652b WatchSource:0}: Error finding container fb896377319c6725eba02d8c4511bcf9dd8dda3b8b58b64b3d1dd7495453652b: Status 404 returned error can't find the container with id fb896377319c6725eba02d8c4511bcf9dd8dda3b8b58b64b3d1dd7495453652b
Apr 20 21:22:04.314462 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:04.314423 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7d98fbbfd6-zrgwd" event={"ID":"5c2a1054-7206-41ca-9226-d02a36d42349","Type":"ContainerStarted","Data":"fb896377319c6725eba02d8c4511bcf9dd8dda3b8b58b64b3d1dd7495453652b"}
Apr 20 21:22:04.342753 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:04.342710 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba6543b2-957c-489c-8cbc-8edea2a8c16d" path="/var/lib/kubelet/pods/ba6543b2-957c-489c-8cbc-8edea2a8c16d/volumes"
Apr 20 21:22:04.343211 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:04.343169 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2b525ef-b582-4d5a-80a9-01682810d764" path="/var/lib/kubelet/pods/d2b525ef-b582-4d5a-80a9-01682810d764/volumes"
Apr 20 21:22:04.343598 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:04.343582 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1723fae-a935-4c50-847c-e61ed8770714" path="/var/lib/kubelet/pods/f1723fae-a935-4c50-847c-e61ed8770714/volumes"
Apr 20 21:22:06.324193 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:06.324155 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7d98fbbfd6-zrgwd" event={"ID":"5c2a1054-7206-41ca-9226-d02a36d42349","Type":"ContainerStarted","Data":"2a45ea1322b5133858e69cf762d0a285f9e8bfa4c8240b9a86f4302edd40ae75"}
Apr 20 21:22:06.324588 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:06.324245 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7d98fbbfd6-zrgwd"
Apr 20 21:22:06.340939 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:06.340893 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7d98fbbfd6-zrgwd" podStartSLOduration=0.982764388 podStartE2EDuration="3.340877317s" podCreationTimestamp="2026-04-20 21:22:03 +0000 UTC" firstStartedPulling="2026-04-20 21:22:03.817618744 +0000 UTC m=+534.060577846" lastFinishedPulling="2026-04-20 21:22:06.175731675 +0000 UTC m=+536.418690775" observedRunningTime="2026-04-20 21:22:06.33873936 +0000 UTC m=+536.581698481" watchObservedRunningTime="2026-04-20 21:22:06.340877317 +0000 UTC m=+536.583836437"
Apr 20 21:22:17.333274 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:17.333243 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7d98fbbfd6-zrgwd"
Apr 20 21:22:18.014151 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:18.014104 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7d98fbbfd6-zrgwd"]
Apr 20 21:22:18.014408 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:18.014382 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7d98fbbfd6-zrgwd" podUID="5c2a1054-7206-41ca-9226-d02a36d42349" containerName="manager" containerID="cri-o://2a45ea1322b5133858e69cf762d0a285f9e8bfa4c8240b9a86f4302edd40ae75" gracePeriod=10
Apr 20 21:22:18.259323 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:18.259298 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7d98fbbfd6-zrgwd"
Apr 20 21:22:18.379164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:18.379080 2571 generic.go:358] "Generic (PLEG): container finished" podID="5c2a1054-7206-41ca-9226-d02a36d42349" containerID="2a45ea1322b5133858e69cf762d0a285f9e8bfa4c8240b9a86f4302edd40ae75" exitCode=0
Apr 20 21:22:18.379164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:18.379119 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7d98fbbfd6-zrgwd" event={"ID":"5c2a1054-7206-41ca-9226-d02a36d42349","Type":"ContainerDied","Data":"2a45ea1322b5133858e69cf762d0a285f9e8bfa4c8240b9a86f4302edd40ae75"}
Apr 20 21:22:18.379164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:18.379138 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7d98fbbfd6-zrgwd"
Apr 20 21:22:18.379164 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:18.379157 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7d98fbbfd6-zrgwd" event={"ID":"5c2a1054-7206-41ca-9226-d02a36d42349","Type":"ContainerDied","Data":"fb896377319c6725eba02d8c4511bcf9dd8dda3b8b58b64b3d1dd7495453652b"}
Apr 20 21:22:18.379667 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:18.379174 2571 scope.go:117] "RemoveContainer" containerID="2a45ea1322b5133858e69cf762d0a285f9e8bfa4c8240b9a86f4302edd40ae75"
Apr 20 21:22:18.387661 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:18.387643 2571 scope.go:117] "RemoveContainer" containerID="2a45ea1322b5133858e69cf762d0a285f9e8bfa4c8240b9a86f4302edd40ae75"
Apr 20 21:22:18.387909 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:22:18.387889 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a45ea1322b5133858e69cf762d0a285f9e8bfa4c8240b9a86f4302edd40ae75\": container with ID starting with 2a45ea1322b5133858e69cf762d0a285f9e8bfa4c8240b9a86f4302edd40ae75 not found: ID does not exist" containerID="2a45ea1322b5133858e69cf762d0a285f9e8bfa4c8240b9a86f4302edd40ae75"
Apr 20 21:22:18.387968 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:18.387923 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a45ea1322b5133858e69cf762d0a285f9e8bfa4c8240b9a86f4302edd40ae75"} err="failed to get container status \"2a45ea1322b5133858e69cf762d0a285f9e8bfa4c8240b9a86f4302edd40ae75\": rpc error: code = NotFound desc = could not find container \"2a45ea1322b5133858e69cf762d0a285f9e8bfa4c8240b9a86f4302edd40ae75\": container with ID starting with 2a45ea1322b5133858e69cf762d0a285f9e8bfa4c8240b9a86f4302edd40ae75 not found: ID does not exist"
Apr 20 21:22:18.428249 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:18.428217 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb8bz\" (UniqueName: \"kubernetes.io/projected/5c2a1054-7206-41ca-9226-d02a36d42349-kube-api-access-nb8bz\") pod \"5c2a1054-7206-41ca-9226-d02a36d42349\" (UID: \"5c2a1054-7206-41ca-9226-d02a36d42349\") "
Apr 20 21:22:18.430568 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:18.430531 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c2a1054-7206-41ca-9226-d02a36d42349-kube-api-access-nb8bz" (OuterVolumeSpecName: "kube-api-access-nb8bz") pod "5c2a1054-7206-41ca-9226-d02a36d42349" (UID: "5c2a1054-7206-41ca-9226-d02a36d42349"). InnerVolumeSpecName "kube-api-access-nb8bz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:22:18.529507 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:18.529472 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nb8bz\" (UniqueName: \"kubernetes.io/projected/5c2a1054-7206-41ca-9226-d02a36d42349-kube-api-access-nb8bz\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\""
Apr 20 21:22:18.706031 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:18.705998 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7d98fbbfd6-zrgwd"]
Apr 20 21:22:18.709780 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:18.709752 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7d98fbbfd6-zrgwd"]
Apr 20 21:22:20.341578 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:20.341543 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c2a1054-7206-41ca-9226-d02a36d42349" path="/var/lib/kubelet/pods/5c2a1054-7206-41ca-9226-d02a36d42349/volumes"
Apr 20 21:22:26.188147 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.188103 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6b46858f4d-nc29c"]
Apr 20 21:22:26.188717 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.188655 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c2a1054-7206-41ca-9226-d02a36d42349" containerName="manager"
Apr 20 21:22:26.188717 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.188678 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2a1054-7206-41ca-9226-d02a36d42349" containerName="manager"
Apr 20 21:22:26.188864 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.188772 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c2a1054-7206-41ca-9226-d02a36d42349" containerName="manager"
Apr 20 21:22:26.193083 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.193064 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6b46858f4d-nc29c"
Apr 20 21:22:26.195419 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.195397 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 20 21:22:26.195419 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.195413 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-nd7zd\""
Apr 20 21:22:26.195566 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.195396 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 20 21:22:26.203475 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.203453 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6b46858f4d-nc29c"]
Apr 20 21:22:26.302928 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.302881 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0d7b4fd1-1f34-48c3-870d-260976533cb7-maas-api-tls\") pod \"maas-api-6b46858f4d-nc29c\" (UID: \"0d7b4fd1-1f34-48c3-870d-260976533cb7\") " pod="opendatahub/maas-api-6b46858f4d-nc29c"
Apr 20 21:22:26.303102 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.302940 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn8qq\" (UniqueName: \"kubernetes.io/projected/0d7b4fd1-1f34-48c3-870d-260976533cb7-kube-api-access-kn8qq\") pod \"maas-api-6b46858f4d-nc29c\" (UID: \"0d7b4fd1-1f34-48c3-870d-260976533cb7\") " pod="opendatahub/maas-api-6b46858f4d-nc29c"
Apr 20 21:22:26.403712 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.403667 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0d7b4fd1-1f34-48c3-870d-260976533cb7-maas-api-tls\") pod \"maas-api-6b46858f4d-nc29c\" (UID: \"0d7b4fd1-1f34-48c3-870d-260976533cb7\") " pod="opendatahub/maas-api-6b46858f4d-nc29c"
Apr 20 21:22:26.403878 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.403728 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kn8qq\" (UniqueName: \"kubernetes.io/projected/0d7b4fd1-1f34-48c3-870d-260976533cb7-kube-api-access-kn8qq\") pod \"maas-api-6b46858f4d-nc29c\" (UID: \"0d7b4fd1-1f34-48c3-870d-260976533cb7\") " pod="opendatahub/maas-api-6b46858f4d-nc29c"
Apr 20 21:22:26.406437 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.406406 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0d7b4fd1-1f34-48c3-870d-260976533cb7-maas-api-tls\") pod \"maas-api-6b46858f4d-nc29c\" (UID: \"0d7b4fd1-1f34-48c3-870d-260976533cb7\") " pod="opendatahub/maas-api-6b46858f4d-nc29c"
Apr 20 21:22:26.416544 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.416516 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn8qq\" (UniqueName: \"kubernetes.io/projected/0d7b4fd1-1f34-48c3-870d-260976533cb7-kube-api-access-kn8qq\") pod \"maas-api-6b46858f4d-nc29c\" (UID: \"0d7b4fd1-1f34-48c3-870d-260976533cb7\") " pod="opendatahub/maas-api-6b46858f4d-nc29c"
Apr 20 21:22:26.504799 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.504759 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6b46858f4d-nc29c"
Apr 20 21:22:26.641880 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:26.641745 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6b46858f4d-nc29c"]
Apr 20 21:22:26.644853 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:22:26.644824 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d7b4fd1_1f34_48c3_870d_260976533cb7.slice/crio-98fe5a8973816df192f1d53a6bfecb8b38bda22c160eeedac1f0552df370b739 WatchSource:0}: Error finding container 98fe5a8973816df192f1d53a6bfecb8b38bda22c160eeedac1f0552df370b739: Status 404 returned error can't find the container with id 98fe5a8973816df192f1d53a6bfecb8b38bda22c160eeedac1f0552df370b739
Apr 20 21:22:27.419473 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:27.419431 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6b46858f4d-nc29c" event={"ID":"0d7b4fd1-1f34-48c3-870d-260976533cb7","Type":"ContainerStarted","Data":"98fe5a8973816df192f1d53a6bfecb8b38bda22c160eeedac1f0552df370b739"}
Apr 20 21:22:29.428791 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:29.428754 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6b46858f4d-nc29c" event={"ID":"0d7b4fd1-1f34-48c3-870d-260976533cb7","Type":"ContainerStarted","Data":"588a93fecf43a8f1a2cd1aa84755e00bcb9bd1c3e43cc633d47afeb0789e5f5c"}
Apr 20 21:22:29.429201 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:29.428804 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6b46858f4d-nc29c"
Apr 20 21:22:29.443141 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:29.443077 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6b46858f4d-nc29c" podStartSLOduration=1.573070923 podStartE2EDuration="3.443061455s" podCreationTimestamp="2026-04-20 21:22:26 +0000 UTC" firstStartedPulling="2026-04-20 21:22:26.646024292 +0000 UTC m=+556.888983389" lastFinishedPulling="2026-04-20 21:22:28.51601481 +0000 UTC m=+558.758973921" observedRunningTime="2026-04-20 21:22:29.44220334 +0000 UTC m=+559.685162451" watchObservedRunningTime="2026-04-20 21:22:29.443061455 +0000 UTC m=+559.686020575"
Apr 20 21:22:35.437661 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:35.437632 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6b46858f4d-nc29c"
Apr 20 21:22:58.974560 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:58.974523 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6b46858f4d-nc29c"]
Apr 20 21:22:58.975018 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:58.974840 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-6b46858f4d-nc29c" podUID="0d7b4fd1-1f34-48c3-870d-260976533cb7" containerName="maas-api" containerID="cri-o://588a93fecf43a8f1a2cd1aa84755e00bcb9bd1c3e43cc633d47afeb0789e5f5c" gracePeriod=30
Apr 20 21:22:59.220099 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.220077 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6b46858f4d-nc29c"
Apr 20 21:22:59.298841 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.298807 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0d7b4fd1-1f34-48c3-870d-260976533cb7-maas-api-tls\") pod \"0d7b4fd1-1f34-48c3-870d-260976533cb7\" (UID: \"0d7b4fd1-1f34-48c3-870d-260976533cb7\") "
Apr 20 21:22:59.298841 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.298845 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn8qq\" (UniqueName: \"kubernetes.io/projected/0d7b4fd1-1f34-48c3-870d-260976533cb7-kube-api-access-kn8qq\") pod \"0d7b4fd1-1f34-48c3-870d-260976533cb7\" (UID: \"0d7b4fd1-1f34-48c3-870d-260976533cb7\") "
Apr 20 21:22:59.301045 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.301006 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d7b4fd1-1f34-48c3-870d-260976533cb7-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "0d7b4fd1-1f34-48c3-870d-260976533cb7" (UID: "0d7b4fd1-1f34-48c3-870d-260976533cb7"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:22:59.301045 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.301020 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d7b4fd1-1f34-48c3-870d-260976533cb7-kube-api-access-kn8qq" (OuterVolumeSpecName: "kube-api-access-kn8qq") pod "0d7b4fd1-1f34-48c3-870d-260976533cb7" (UID: "0d7b4fd1-1f34-48c3-870d-260976533cb7"). InnerVolumeSpecName "kube-api-access-kn8qq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:22:59.400138 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.400101 2571 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0d7b4fd1-1f34-48c3-870d-260976533cb7-maas-api-tls\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\""
Apr 20 21:22:59.400138 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.400132 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kn8qq\" (UniqueName: \"kubernetes.io/projected/0d7b4fd1-1f34-48c3-870d-260976533cb7-kube-api-access-kn8qq\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\""
Apr 20 21:22:59.537925 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.537886 2571 generic.go:358] "Generic (PLEG): container finished" podID="0d7b4fd1-1f34-48c3-870d-260976533cb7" containerID="588a93fecf43a8f1a2cd1aa84755e00bcb9bd1c3e43cc633d47afeb0789e5f5c" exitCode=0
Apr 20 21:22:59.538083 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.537947 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6b46858f4d-nc29c"
Apr 20 21:22:59.538083 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.537944 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6b46858f4d-nc29c" event={"ID":"0d7b4fd1-1f34-48c3-870d-260976533cb7","Type":"ContainerDied","Data":"588a93fecf43a8f1a2cd1aa84755e00bcb9bd1c3e43cc633d47afeb0789e5f5c"}
Apr 20 21:22:59.538083 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.538056 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6b46858f4d-nc29c" event={"ID":"0d7b4fd1-1f34-48c3-870d-260976533cb7","Type":"ContainerDied","Data":"98fe5a8973816df192f1d53a6bfecb8b38bda22c160eeedac1f0552df370b739"}
Apr 20 21:22:59.538083 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.538075 2571 scope.go:117] "RemoveContainer" containerID="588a93fecf43a8f1a2cd1aa84755e00bcb9bd1c3e43cc633d47afeb0789e5f5c"
Apr 20 21:22:59.547128 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.547108 2571 scope.go:117] "RemoveContainer" containerID="588a93fecf43a8f1a2cd1aa84755e00bcb9bd1c3e43cc633d47afeb0789e5f5c"
Apr 20 21:22:59.547390 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:22:59.547373 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"588a93fecf43a8f1a2cd1aa84755e00bcb9bd1c3e43cc633d47afeb0789e5f5c\": container with ID starting with 588a93fecf43a8f1a2cd1aa84755e00bcb9bd1c3e43cc633d47afeb0789e5f5c not found: ID does not exist" containerID="588a93fecf43a8f1a2cd1aa84755e00bcb9bd1c3e43cc633d47afeb0789e5f5c"
Apr 20 21:22:59.547436 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.547402 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"588a93fecf43a8f1a2cd1aa84755e00bcb9bd1c3e43cc633d47afeb0789e5f5c"} err="failed to get container status \"588a93fecf43a8f1a2cd1aa84755e00bcb9bd1c3e43cc633d47afeb0789e5f5c\": rpc error: code = NotFound desc = could not find container \"588a93fecf43a8f1a2cd1aa84755e00bcb9bd1c3e43cc633d47afeb0789e5f5c\": container with ID starting with 588a93fecf43a8f1a2cd1aa84755e00bcb9bd1c3e43cc633d47afeb0789e5f5c not found: ID does not exist"
Apr 20 21:22:59.558287 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.558208 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6b46858f4d-nc29c"]
Apr 20 21:22:59.560066 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:22:59.560045 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-6b46858f4d-nc29c"]
Apr 20 21:23:00.343355 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:00.343326 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d7b4fd1-1f34-48c3-870d-260976533cb7" path="/var/lib/kubelet/pods/0d7b4fd1-1f34-48c3-870d-260976533cb7/volumes"
Apr 20 21:23:08.430809 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.430775 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"]
Apr 20 21:23:08.431268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.431127 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d7b4fd1-1f34-48c3-870d-260976533cb7" containerName="maas-api"
Apr 20 21:23:08.431268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.431137 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7b4fd1-1f34-48c3-870d-260976533cb7" containerName="maas-api"
Apr 20 21:23:08.431268 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.431221 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d7b4fd1-1f34-48c3-870d-260976533cb7" containerName="maas-api"
Apr 20 21:23:08.433838 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.433821 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.435929 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.435900 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 20 21:23:08.436516 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.436496 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 20 21:23:08.436612 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.436549 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\""
Apr 20 21:23:08.436612 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.436565 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-rl5wt\""
Apr 20 21:23:08.444320 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.444296 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"]
Apr 20 21:23:08.584678 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.584635 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/36c013e0-a4ff-4ced-9363-069ce4cc1c44-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.584678 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.584678 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6p5p\" (UniqueName: \"kubernetes.io/projected/36c013e0-a4ff-4ced-9363-069ce4cc1c44-kube-api-access-k6p5p\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.584921 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.584712 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36c013e0-a4ff-4ced-9363-069ce4cc1c44-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.584921 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.584760 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/36c013e0-a4ff-4ced-9363-069ce4cc1c44-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.584921 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.584780 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/36c013e0-a4ff-4ced-9363-069ce4cc1c44-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.584921 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.584808 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36c013e0-a4ff-4ced-9363-069ce4cc1c44-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.686366 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.686239 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/36c013e0-a4ff-4ced-9363-069ce4cc1c44-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.686366 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.686308 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/36c013e0-a4ff-4ced-9363-069ce4cc1c44-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.686366 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.686335 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36c013e0-a4ff-4ced-9363-069ce4cc1c44-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.686638 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.686395 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/36c013e0-a4ff-4ced-9363-069ce4cc1c44-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.686638 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.686454 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6p5p\" (UniqueName: \"kubernetes.io/projected/36c013e0-a4ff-4ced-9363-069ce4cc1c44-kube-api-access-k6p5p\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.686638 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.686507 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36c013e0-a4ff-4ced-9363-069ce4cc1c44-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.686812 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.686785 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/36c013e0-a4ff-4ced-9363-069ce4cc1c44-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.686859 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.686811 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/36c013e0-a4ff-4ced-9363-069ce4cc1c44-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.686891 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.686851 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/36c013e0-a4ff-4ced-9363-069ce4cc1c44-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.688668 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.688637 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/36c013e0-a4ff-4ced-9363-069ce4cc1c44-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.689029 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.689005 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/36c013e0-a4ff-4ced-9363-069ce4cc1c44-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.694046 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.694024 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6p5p\" (UniqueName: \"kubernetes.io/projected/36c013e0-a4ff-4ced-9363-069ce4cc1c44-kube-api-access-k6p5p\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7\" (UID: \"36c013e0-a4ff-4ced-9363-069ce4cc1c44\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.744863 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.744820 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"
Apr 20 21:23:08.881311 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:08.881283 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7"]
Apr 20 21:23:08.883373 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:23:08.883343 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36c013e0_a4ff_4ced_9363_069ce4cc1c44.slice/crio-1b7d673aad6bd9bd48b9d3f4b8bdb72d2cc187dac72429cbacbd02ef767a5355 WatchSource:0}: Error finding container 1b7d673aad6bd9bd48b9d3f4b8bdb72d2cc187dac72429cbacbd02ef767a5355: Status 404 returned error can't find the container with id 1b7d673aad6bd9bd48b9d3f4b8bdb72d2cc187dac72429cbacbd02ef767a5355
Apr 20 21:23:09.577681 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:09.577641 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7" event={"ID":"36c013e0-a4ff-4ced-9363-069ce4cc1c44","Type":"ContainerStarted","Data":"1b7d673aad6bd9bd48b9d3f4b8bdb72d2cc187dac72429cbacbd02ef767a5355"}
Apr 20 21:23:10.235922 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:10.235889 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log"
Apr 20 21:23:10.238021 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:10.237995 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log"
Apr 20 21:23:10.242257 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:10.242232 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log"
Apr 20 21:23:10.243927 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:10.243906 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log"
Apr 20 21:23:14.598427 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:14.598385 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7" event={"ID":"36c013e0-a4ff-4ced-9363-069ce4cc1c44","Type":"ContainerStarted","Data":"0c9b24c0397abcf465704bd4c938d45ed07ed9e3125a0b05325dd6d70a35af03"}
Apr 20 21:23:20.621910 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:20.621873 2571 generic.go:358] "Generic (PLEG): container finished" podID="36c013e0-a4ff-4ced-9363-069ce4cc1c44" containerID="0c9b24c0397abcf465704bd4c938d45ed07ed9e3125a0b05325dd6d70a35af03" exitCode=0
Apr 20 21:23:20.622359 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:20.621950 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7" event={"ID":"36c013e0-a4ff-4ced-9363-069ce4cc1c44","Type":"ContainerDied","Data":"0c9b24c0397abcf465704bd4c938d45ed07ed9e3125a0b05325dd6d70a35af03"}
Apr 20 21:23:20.622628 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:20.622611 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 21:23:22.632507 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:22.632470 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7" event={"ID":"36c013e0-a4ff-4ced-9363-069ce4cc1c44","Type":"ContainerStarted","Data":"8292a634232cbdceb7236d9556f0718ffa397088e883dc32ed696ff33c2c20fe"}
Apr 20 21:23:22.632912 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:22.632692
2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7" Apr 20 21:23:22.651014 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:22.650955 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7" podStartSLOduration=1.8146909039999999 podStartE2EDuration="14.650938419s" podCreationTimestamp="2026-04-20 21:23:08 +0000 UTC" firstStartedPulling="2026-04-20 21:23:08.885255548 +0000 UTC m=+599.128214647" lastFinishedPulling="2026-04-20 21:23:21.721503061 +0000 UTC m=+611.964462162" observedRunningTime="2026-04-20 21:23:22.648619777 +0000 UTC m=+612.891578897" watchObservedRunningTime="2026-04-20 21:23:22.650938419 +0000 UTC m=+612.893897538" Apr 20 21:23:33.649564 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:23:33.649478 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7" Apr 20 21:24:25.284855 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:25.284809 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7d5fd76f94-q8rtt"] Apr 20 21:24:25.288665 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:25.288636 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7d5fd76f94-q8rtt" Apr 20 21:24:25.294829 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:25.294797 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7d5fd76f94-q8rtt"] Apr 20 21:24:25.393681 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:25.393638 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5hfc\" (UniqueName: \"kubernetes.io/projected/f76b5c7f-0653-4eab-b706-ebff9786f2e5-kube-api-access-s5hfc\") pod \"authorino-7d5fd76f94-q8rtt\" (UID: \"f76b5c7f-0653-4eab-b706-ebff9786f2e5\") " pod="kuadrant-system/authorino-7d5fd76f94-q8rtt" Apr 20 21:24:25.393863 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:25.393782 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f76b5c7f-0653-4eab-b706-ebff9786f2e5-tls-cert\") pod \"authorino-7d5fd76f94-q8rtt\" (UID: \"f76b5c7f-0653-4eab-b706-ebff9786f2e5\") " pod="kuadrant-system/authorino-7d5fd76f94-q8rtt" Apr 20 21:24:25.495014 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:25.494971 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f76b5c7f-0653-4eab-b706-ebff9786f2e5-tls-cert\") pod \"authorino-7d5fd76f94-q8rtt\" (UID: \"f76b5c7f-0653-4eab-b706-ebff9786f2e5\") " pod="kuadrant-system/authorino-7d5fd76f94-q8rtt" Apr 20 21:24:25.495154 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:25.495034 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5hfc\" (UniqueName: \"kubernetes.io/projected/f76b5c7f-0653-4eab-b706-ebff9786f2e5-kube-api-access-s5hfc\") pod \"authorino-7d5fd76f94-q8rtt\" (UID: \"f76b5c7f-0653-4eab-b706-ebff9786f2e5\") " pod="kuadrant-system/authorino-7d5fd76f94-q8rtt" Apr 20 21:24:25.497687 ip-10-0-129-149 kubenswrapper[2571]: 
I0420 21:24:25.497663 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f76b5c7f-0653-4eab-b706-ebff9786f2e5-tls-cert\") pod \"authorino-7d5fd76f94-q8rtt\" (UID: \"f76b5c7f-0653-4eab-b706-ebff9786f2e5\") " pod="kuadrant-system/authorino-7d5fd76f94-q8rtt" Apr 20 21:24:25.503611 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:25.503580 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5hfc\" (UniqueName: \"kubernetes.io/projected/f76b5c7f-0653-4eab-b706-ebff9786f2e5-kube-api-access-s5hfc\") pod \"authorino-7d5fd76f94-q8rtt\" (UID: \"f76b5c7f-0653-4eab-b706-ebff9786f2e5\") " pod="kuadrant-system/authorino-7d5fd76f94-q8rtt" Apr 20 21:24:25.600530 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:25.600428 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7d5fd76f94-q8rtt" Apr 20 21:24:25.735888 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:25.735853 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7d5fd76f94-q8rtt"] Apr 20 21:24:25.738806 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:24:25.738771 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf76b5c7f_0653_4eab_b706_ebff9786f2e5.slice/crio-0ed2ec5b120e55e2dc0f1e54a0f0c6f50618a2fbf631c220cf168f78940dba03 WatchSource:0}: Error finding container 0ed2ec5b120e55e2dc0f1e54a0f0c6f50618a2fbf631c220cf168f78940dba03: Status 404 returned error can't find the container with id 0ed2ec5b120e55e2dc0f1e54a0f0c6f50618a2fbf631c220cf168f78940dba03 Apr 20 21:24:25.872559 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:25.872465 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7d5fd76f94-q8rtt" 
event={"ID":"f76b5c7f-0653-4eab-b706-ebff9786f2e5","Type":"ContainerStarted","Data":"0ed2ec5b120e55e2dc0f1e54a0f0c6f50618a2fbf631c220cf168f78940dba03"} Apr 20 21:24:26.877694 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:26.877654 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7d5fd76f94-q8rtt" event={"ID":"f76b5c7f-0653-4eab-b706-ebff9786f2e5","Type":"ContainerStarted","Data":"dc7f00b64692e28c95531cba140e6c6d27e4a9f6e1c57bfbfdff81d66e51bc2b"} Apr 20 21:24:26.892384 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:26.892323 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7d5fd76f94-q8rtt" podStartSLOduration=1.452887071 podStartE2EDuration="1.892305961s" podCreationTimestamp="2026-04-20 21:24:25 +0000 UTC" firstStartedPulling="2026-04-20 21:24:25.740118812 +0000 UTC m=+675.983077910" lastFinishedPulling="2026-04-20 21:24:26.179537699 +0000 UTC m=+676.422496800" observedRunningTime="2026-04-20 21:24:26.891321623 +0000 UTC m=+677.134280740" watchObservedRunningTime="2026-04-20 21:24:26.892305961 +0000 UTC m=+677.135265080" Apr 20 21:24:26.917728 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:26.917680 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-85ff7bd8dc-p2z8j"] Apr 20 21:24:26.918348 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:26.918305 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" podUID="0017fa53-28a3-4fbd-a39c-9a40c5def571" containerName="authorino" containerID="cri-o://0a060ec9784afcc577542bb6734b3338c25b513df75ba7253e86f78897548ec4" gracePeriod=30 Apr 20 21:24:27.188661 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.188635 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" Apr 20 21:24:27.311678 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.311640 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgfxq\" (UniqueName: \"kubernetes.io/projected/0017fa53-28a3-4fbd-a39c-9a40c5def571-kube-api-access-vgfxq\") pod \"0017fa53-28a3-4fbd-a39c-9a40c5def571\" (UID: \"0017fa53-28a3-4fbd-a39c-9a40c5def571\") " Apr 20 21:24:27.311857 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.311690 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0017fa53-28a3-4fbd-a39c-9a40c5def571-tls-cert\") pod \"0017fa53-28a3-4fbd-a39c-9a40c5def571\" (UID: \"0017fa53-28a3-4fbd-a39c-9a40c5def571\") " Apr 20 21:24:27.314017 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.313984 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0017fa53-28a3-4fbd-a39c-9a40c5def571-kube-api-access-vgfxq" (OuterVolumeSpecName: "kube-api-access-vgfxq") pod "0017fa53-28a3-4fbd-a39c-9a40c5def571" (UID: "0017fa53-28a3-4fbd-a39c-9a40c5def571"). InnerVolumeSpecName "kube-api-access-vgfxq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:24:27.323691 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.323653 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0017fa53-28a3-4fbd-a39c-9a40c5def571-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "0017fa53-28a3-4fbd-a39c-9a40c5def571" (UID: "0017fa53-28a3-4fbd-a39c-9a40c5def571"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:24:27.412728 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.412688 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vgfxq\" (UniqueName: \"kubernetes.io/projected/0017fa53-28a3-4fbd-a39c-9a40c5def571-kube-api-access-vgfxq\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:24:27.412728 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.412727 2571 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0017fa53-28a3-4fbd-a39c-9a40c5def571-tls-cert\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:24:27.882437 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.882319 2571 generic.go:358] "Generic (PLEG): container finished" podID="0017fa53-28a3-4fbd-a39c-9a40c5def571" containerID="0a060ec9784afcc577542bb6734b3338c25b513df75ba7253e86f78897548ec4" exitCode=0 Apr 20 21:24:27.882437 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.882395 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" Apr 20 21:24:27.882437 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.882406 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" event={"ID":"0017fa53-28a3-4fbd-a39c-9a40c5def571","Type":"ContainerDied","Data":"0a060ec9784afcc577542bb6734b3338c25b513df75ba7253e86f78897548ec4"} Apr 20 21:24:27.882957 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.882446 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85ff7bd8dc-p2z8j" event={"ID":"0017fa53-28a3-4fbd-a39c-9a40c5def571","Type":"ContainerDied","Data":"92a60f82ffc4a04cc35ae11448efe595bf21b81282c44233b6c42c0f61f449f8"} Apr 20 21:24:27.882957 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.882463 2571 scope.go:117] "RemoveContainer" containerID="0a060ec9784afcc577542bb6734b3338c25b513df75ba7253e86f78897548ec4" Apr 20 21:24:27.891876 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.891849 2571 scope.go:117] "RemoveContainer" containerID="0a060ec9784afcc577542bb6734b3338c25b513df75ba7253e86f78897548ec4" Apr 20 21:24:27.892232 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:24:27.892207 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a060ec9784afcc577542bb6734b3338c25b513df75ba7253e86f78897548ec4\": container with ID starting with 0a060ec9784afcc577542bb6734b3338c25b513df75ba7253e86f78897548ec4 not found: ID does not exist" containerID="0a060ec9784afcc577542bb6734b3338c25b513df75ba7253e86f78897548ec4" Apr 20 21:24:27.892287 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.892249 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a060ec9784afcc577542bb6734b3338c25b513df75ba7253e86f78897548ec4"} err="failed to get container status \"0a060ec9784afcc577542bb6734b3338c25b513df75ba7253e86f78897548ec4\": rpc error: code = 
NotFound desc = could not find container \"0a060ec9784afcc577542bb6734b3338c25b513df75ba7253e86f78897548ec4\": container with ID starting with 0a060ec9784afcc577542bb6734b3338c25b513df75ba7253e86f78897548ec4 not found: ID does not exist" Apr 20 21:24:27.910457 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.910419 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-85ff7bd8dc-p2z8j"] Apr 20 21:24:27.913761 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:27.913730 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-85ff7bd8dc-p2z8j"] Apr 20 21:24:28.342273 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:24:28.342239 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0017fa53-28a3-4fbd-a39c-9a40c5def571" path="/var/lib/kubelet/pods/0017fa53-28a3-4fbd-a39c-9a40c5def571/volumes" Apr 20 21:25:56.379520 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:56.379480 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-b7b7fc65d-vdpfj"] Apr 20 21:25:56.380086 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:56.380017 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0017fa53-28a3-4fbd-a39c-9a40c5def571" containerName="authorino" Apr 20 21:25:56.380086 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:56.380038 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0017fa53-28a3-4fbd-a39c-9a40c5def571" containerName="authorino" Apr 20 21:25:56.380244 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:56.380126 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0017fa53-28a3-4fbd-a39c-9a40c5def571" containerName="authorino" Apr 20 21:25:56.383305 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:56.383282 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-b7b7fc65d-vdpfj" Apr 20 21:25:56.385456 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:56.385435 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-prkzx\"" Apr 20 21:25:56.393463 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:56.393439 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-b7b7fc65d-vdpfj"] Apr 20 21:25:56.406819 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:56.406781 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxrsr\" (UniqueName: \"kubernetes.io/projected/186e0313-7256-4591-9648-082da62b08cf-kube-api-access-vxrsr\") pod \"maas-controller-b7b7fc65d-vdpfj\" (UID: \"186e0313-7256-4591-9648-082da62b08cf\") " pod="opendatahub/maas-controller-b7b7fc65d-vdpfj" Apr 20 21:25:56.507875 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:56.507837 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxrsr\" (UniqueName: \"kubernetes.io/projected/186e0313-7256-4591-9648-082da62b08cf-kube-api-access-vxrsr\") pod \"maas-controller-b7b7fc65d-vdpfj\" (UID: \"186e0313-7256-4591-9648-082da62b08cf\") " pod="opendatahub/maas-controller-b7b7fc65d-vdpfj" Apr 20 21:25:56.515836 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:56.515800 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxrsr\" (UniqueName: \"kubernetes.io/projected/186e0313-7256-4591-9648-082da62b08cf-kube-api-access-vxrsr\") pod \"maas-controller-b7b7fc65d-vdpfj\" (UID: \"186e0313-7256-4591-9648-082da62b08cf\") " pod="opendatahub/maas-controller-b7b7fc65d-vdpfj" Apr 20 21:25:56.693394 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:56.693287 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-b7b7fc65d-vdpfj" Apr 20 21:25:56.824011 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:56.823980 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-b7b7fc65d-vdpfj"] Apr 20 21:25:56.825780 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:25:56.825749 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod186e0313_7256_4591_9648_082da62b08cf.slice/crio-6f614ee5befbe71f4f5ce8ccc4f1ed53147a8d1653acdf1b02e73ba248b2dc15 WatchSource:0}: Error finding container 6f614ee5befbe71f4f5ce8ccc4f1ed53147a8d1653acdf1b02e73ba248b2dc15: Status 404 returned error can't find the container with id 6f614ee5befbe71f4f5ce8ccc4f1ed53147a8d1653acdf1b02e73ba248b2dc15 Apr 20 21:25:57.224144 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:57.224107 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-b7b7fc65d-vdpfj" event={"ID":"186e0313-7256-4591-9648-082da62b08cf","Type":"ContainerStarted","Data":"6f614ee5befbe71f4f5ce8ccc4f1ed53147a8d1653acdf1b02e73ba248b2dc15"} Apr 20 21:25:58.229360 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:58.229321 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-b7b7fc65d-vdpfj" event={"ID":"186e0313-7256-4591-9648-082da62b08cf","Type":"ContainerStarted","Data":"25496adb4768d2eab59b31b9044a0f567c7235991b8bacbbd06e722df998ec56"} Apr 20 21:25:58.229721 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:58.229430 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-b7b7fc65d-vdpfj" Apr 20 21:25:58.244246 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:25:58.244155 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-b7b7fc65d-vdpfj" podStartSLOduration=1.762819864 podStartE2EDuration="2.244138364s" 
podCreationTimestamp="2026-04-20 21:25:56 +0000 UTC" firstStartedPulling="2026-04-20 21:25:56.827002718 +0000 UTC m=+767.069961816" lastFinishedPulling="2026-04-20 21:25:57.308321219 +0000 UTC m=+767.551280316" observedRunningTime="2026-04-20 21:25:58.242945082 +0000 UTC m=+768.485904202" watchObservedRunningTime="2026-04-20 21:25:58.244138364 +0000 UTC m=+768.487097485" Apr 20 21:26:09.238445 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:26:09.238413 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-b7b7fc65d-vdpfj" Apr 20 21:28:10.265502 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:28:10.265474 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log" Apr 20 21:28:10.270595 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:28:10.270566 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log" Apr 20 21:28:10.271228 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:28:10.271204 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log" Apr 20 21:28:10.275442 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:28:10.275421 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log" Apr 20 21:30:00.138880 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:00.138642 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29612010-9hcpd"] Apr 20 21:30:00.142464 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:00.142448 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" Apr 20 21:30:00.144364 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:00.144344 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-nd7zd\"" Apr 20 21:30:00.154226 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:00.154205 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612010-9hcpd"] Apr 20 21:30:00.219203 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:00.219147 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72s8j\" (UniqueName: \"kubernetes.io/projected/af74f9ab-b027-4c3d-aa5e-9bdccad76954-kube-api-access-72s8j\") pod \"maas-api-key-cleanup-29612010-9hcpd\" (UID: \"af74f9ab-b027-4c3d-aa5e-9bdccad76954\") " pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" Apr 20 21:30:00.320285 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:00.320251 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72s8j\" (UniqueName: \"kubernetes.io/projected/af74f9ab-b027-4c3d-aa5e-9bdccad76954-kube-api-access-72s8j\") pod \"maas-api-key-cleanup-29612010-9hcpd\" (UID: \"af74f9ab-b027-4c3d-aa5e-9bdccad76954\") " pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" Apr 20 21:30:00.327497 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:00.327473 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72s8j\" (UniqueName: \"kubernetes.io/projected/af74f9ab-b027-4c3d-aa5e-9bdccad76954-kube-api-access-72s8j\") pod \"maas-api-key-cleanup-29612010-9hcpd\" (UID: \"af74f9ab-b027-4c3d-aa5e-9bdccad76954\") " pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" Apr 20 21:30:00.453256 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:00.453164 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" Apr 20 21:30:00.585290 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:00.585161 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612010-9hcpd"] Apr 20 21:30:00.587760 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:30:00.587728 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf74f9ab_b027_4c3d_aa5e_9bdccad76954.slice/crio-3d3ddbebfb61c86e294fe82aa8edb52be4b135ec6e5cc4b8e5a0219364d66417 WatchSource:0}: Error finding container 3d3ddbebfb61c86e294fe82aa8edb52be4b135ec6e5cc4b8e5a0219364d66417: Status 404 returned error can't find the container with id 3d3ddbebfb61c86e294fe82aa8edb52be4b135ec6e5cc4b8e5a0219364d66417 Apr 20 21:30:00.589546 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:00.589529 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 21:30:01.145063 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:01.145023 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" event={"ID":"af74f9ab-b027-4c3d-aa5e-9bdccad76954","Type":"ContainerStarted","Data":"3d3ddbebfb61c86e294fe82aa8edb52be4b135ec6e5cc4b8e5a0219364d66417"} Apr 20 21:30:03.154069 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:03.154030 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" event={"ID":"af74f9ab-b027-4c3d-aa5e-9bdccad76954","Type":"ContainerStarted","Data":"a6f4f799b1d59342923fa0ae11618bb431d00796a761800005d065d8dc33f769"} Apr 20 21:30:03.167350 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:03.167298 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" podStartSLOduration=1.8911702639999999 podStartE2EDuration="3.167281336s" 
podCreationTimestamp="2026-04-20 21:30:00 +0000 UTC" firstStartedPulling="2026-04-20 21:30:00.589659132 +0000 UTC m=+1010.832618231" lastFinishedPulling="2026-04-20 21:30:01.865770192 +0000 UTC m=+1012.108729303" observedRunningTime="2026-04-20 21:30:03.166832055 +0000 UTC m=+1013.409791176" watchObservedRunningTime="2026-04-20 21:30:03.167281336 +0000 UTC m=+1013.410240456" Apr 20 21:30:23.242862 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:23.242819 2571 generic.go:358] "Generic (PLEG): container finished" podID="af74f9ab-b027-4c3d-aa5e-9bdccad76954" containerID="a6f4f799b1d59342923fa0ae11618bb431d00796a761800005d065d8dc33f769" exitCode=6 Apr 20 21:30:23.243273 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:23.242893 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" event={"ID":"af74f9ab-b027-4c3d-aa5e-9bdccad76954","Type":"ContainerDied","Data":"a6f4f799b1d59342923fa0ae11618bb431d00796a761800005d065d8dc33f769"} Apr 20 21:30:23.243273 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:23.243198 2571 scope.go:117] "RemoveContainer" containerID="a6f4f799b1d59342923fa0ae11618bb431d00796a761800005d065d8dc33f769" Apr 20 21:30:24.253248 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:24.253209 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" event={"ID":"af74f9ab-b027-4c3d-aa5e-9bdccad76954","Type":"ContainerStarted","Data":"2ede31d54b736f8e0d09af1a57c1e899a9dae0bd7949509328e9479ad9c320bc"} Apr 20 21:30:44.329711 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:44.329679 2571 generic.go:358] "Generic (PLEG): container finished" podID="af74f9ab-b027-4c3d-aa5e-9bdccad76954" containerID="2ede31d54b736f8e0d09af1a57c1e899a9dae0bd7949509328e9479ad9c320bc" exitCode=6 Apr 20 21:30:44.330210 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:44.329751 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" 
event={"ID":"af74f9ab-b027-4c3d-aa5e-9bdccad76954","Type":"ContainerDied","Data":"2ede31d54b736f8e0d09af1a57c1e899a9dae0bd7949509328e9479ad9c320bc"} Apr 20 21:30:44.330210 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:44.329793 2571 scope.go:117] "RemoveContainer" containerID="a6f4f799b1d59342923fa0ae11618bb431d00796a761800005d065d8dc33f769" Apr 20 21:30:44.330210 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:44.330089 2571 scope.go:117] "RemoveContainer" containerID="2ede31d54b736f8e0d09af1a57c1e899a9dae0bd7949509328e9479ad9c320bc" Apr 20 21:30:44.330375 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:30:44.330354 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29612010-9hcpd_opendatahub(af74f9ab-b027-4c3d-aa5e-9bdccad76954)\"" pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" podUID="af74f9ab-b027-4c3d-aa5e-9bdccad76954" Apr 20 21:30:55.337304 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:55.337277 2571 scope.go:117] "RemoveContainer" containerID="2ede31d54b736f8e0d09af1a57c1e899a9dae0bd7949509328e9479ad9c320bc" Apr 20 21:30:56.378020 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:56.377978 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" event={"ID":"af74f9ab-b027-4c3d-aa5e-9bdccad76954","Type":"ContainerStarted","Data":"e3423b90ee77888c7a5fd86245e3725f56315dac132a26d0ea6c5b316cdb5173"} Apr 20 21:30:57.401359 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:57.401284 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612010-9hcpd"] Apr 20 21:30:57.401720 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:30:57.401501 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" podUID="af74f9ab-b027-4c3d-aa5e-9bdccad76954" 
containerName="cleanup" containerID="cri-o://e3423b90ee77888c7a5fd86245e3725f56315dac132a26d0ea6c5b316cdb5173" gracePeriod=30 Apr 20 21:31:16.153563 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.153537 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" Apr 20 21:31:16.265393 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.265355 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72s8j\" (UniqueName: \"kubernetes.io/projected/af74f9ab-b027-4c3d-aa5e-9bdccad76954-kube-api-access-72s8j\") pod \"af74f9ab-b027-4c3d-aa5e-9bdccad76954\" (UID: \"af74f9ab-b027-4c3d-aa5e-9bdccad76954\") " Apr 20 21:31:16.267294 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.267272 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af74f9ab-b027-4c3d-aa5e-9bdccad76954-kube-api-access-72s8j" (OuterVolumeSpecName: "kube-api-access-72s8j") pod "af74f9ab-b027-4c3d-aa5e-9bdccad76954" (UID: "af74f9ab-b027-4c3d-aa5e-9bdccad76954"). InnerVolumeSpecName "kube-api-access-72s8j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:31:16.366697 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.366670 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-72s8j\" (UniqueName: \"kubernetes.io/projected/af74f9ab-b027-4c3d-aa5e-9bdccad76954-kube-api-access-72s8j\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:31:16.449302 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.449272 2571 generic.go:358] "Generic (PLEG): container finished" podID="af74f9ab-b027-4c3d-aa5e-9bdccad76954" containerID="e3423b90ee77888c7a5fd86245e3725f56315dac132a26d0ea6c5b316cdb5173" exitCode=6 Apr 20 21:31:16.449423 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.449350 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" event={"ID":"af74f9ab-b027-4c3d-aa5e-9bdccad76954","Type":"ContainerDied","Data":"e3423b90ee77888c7a5fd86245e3725f56315dac132a26d0ea6c5b316cdb5173"} Apr 20 21:31:16.449423 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.449364 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" Apr 20 21:31:16.449423 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.449379 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612010-9hcpd" event={"ID":"af74f9ab-b027-4c3d-aa5e-9bdccad76954","Type":"ContainerDied","Data":"3d3ddbebfb61c86e294fe82aa8edb52be4b135ec6e5cc4b8e5a0219364d66417"} Apr 20 21:31:16.449423 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.449396 2571 scope.go:117] "RemoveContainer" containerID="e3423b90ee77888c7a5fd86245e3725f56315dac132a26d0ea6c5b316cdb5173" Apr 20 21:31:16.457739 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.457716 2571 scope.go:117] "RemoveContainer" containerID="2ede31d54b736f8e0d09af1a57c1e899a9dae0bd7949509328e9479ad9c320bc" Apr 20 21:31:16.463043 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.463021 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612010-9hcpd"] Apr 20 21:31:16.466508 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.466488 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612010-9hcpd"] Apr 20 21:31:16.466760 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.466745 2571 scope.go:117] "RemoveContainer" containerID="e3423b90ee77888c7a5fd86245e3725f56315dac132a26d0ea6c5b316cdb5173" Apr 20 21:31:16.467002 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:31:16.466987 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3423b90ee77888c7a5fd86245e3725f56315dac132a26d0ea6c5b316cdb5173\": container with ID starting with e3423b90ee77888c7a5fd86245e3725f56315dac132a26d0ea6c5b316cdb5173 not found: ID does not exist" containerID="e3423b90ee77888c7a5fd86245e3725f56315dac132a26d0ea6c5b316cdb5173" Apr 20 21:31:16.467044 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.467010 2571 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3423b90ee77888c7a5fd86245e3725f56315dac132a26d0ea6c5b316cdb5173"} err="failed to get container status \"e3423b90ee77888c7a5fd86245e3725f56315dac132a26d0ea6c5b316cdb5173\": rpc error: code = NotFound desc = could not find container \"e3423b90ee77888c7a5fd86245e3725f56315dac132a26d0ea6c5b316cdb5173\": container with ID starting with e3423b90ee77888c7a5fd86245e3725f56315dac132a26d0ea6c5b316cdb5173 not found: ID does not exist" Apr 20 21:31:16.467044 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.467026 2571 scope.go:117] "RemoveContainer" containerID="2ede31d54b736f8e0d09af1a57c1e899a9dae0bd7949509328e9479ad9c320bc" Apr 20 21:31:16.467234 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:31:16.467218 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ede31d54b736f8e0d09af1a57c1e899a9dae0bd7949509328e9479ad9c320bc\": container with ID starting with 2ede31d54b736f8e0d09af1a57c1e899a9dae0bd7949509328e9479ad9c320bc not found: ID does not exist" containerID="2ede31d54b736f8e0d09af1a57c1e899a9dae0bd7949509328e9479ad9c320bc" Apr 20 21:31:16.467278 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:16.467241 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ede31d54b736f8e0d09af1a57c1e899a9dae0bd7949509328e9479ad9c320bc"} err="failed to get container status \"2ede31d54b736f8e0d09af1a57c1e899a9dae0bd7949509328e9479ad9c320bc\": rpc error: code = NotFound desc = could not find container \"2ede31d54b736f8e0d09af1a57c1e899a9dae0bd7949509328e9479ad9c320bc\": container with ID starting with 2ede31d54b736f8e0d09af1a57c1e899a9dae0bd7949509328e9479ad9c320bc not found: ID does not exist" Apr 20 21:31:18.341788 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:31:18.341750 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af74f9ab-b027-4c3d-aa5e-9bdccad76954" 
path="/var/lib/kubelet/pods/af74f9ab-b027-4c3d-aa5e-9bdccad76954/volumes" Apr 20 21:33:10.294400 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:33:10.294370 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log" Apr 20 21:33:10.298615 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:33:10.298591 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log" Apr 20 21:33:10.298876 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:33:10.298861 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log" Apr 20 21:33:10.303490 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:33:10.303474 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log" Apr 20 21:38:10.321294 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:38:10.321264 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log" Apr 20 21:38:10.326012 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:38:10.325990 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log" Apr 20 21:38:10.327048 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:38:10.327030 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log" Apr 20 21:38:10.331461 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:38:10.331443 2571 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log" Apr 20 21:43:10.354725 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:43:10.354696 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log" Apr 20 21:43:10.359459 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:43:10.359435 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log" Apr 20 21:43:10.361252 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:43:10.361227 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log" Apr 20 21:43:10.365635 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:43:10.365620 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log" Apr 20 21:45:00.134911 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.134829 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29612025-6vglv"] Apr 20 21:45:00.135365 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.135216 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af74f9ab-b027-4c3d-aa5e-9bdccad76954" containerName="cleanup" Apr 20 21:45:00.135365 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.135229 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="af74f9ab-b027-4c3d-aa5e-9bdccad76954" containerName="cleanup" Apr 20 21:45:00.135365 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.135242 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="af74f9ab-b027-4c3d-aa5e-9bdccad76954" containerName="cleanup" Apr 20 21:45:00.135365 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.135250 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="af74f9ab-b027-4c3d-aa5e-9bdccad76954" containerName="cleanup" Apr 20 21:45:00.135365 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.135326 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="af74f9ab-b027-4c3d-aa5e-9bdccad76954" containerName="cleanup" Apr 20 21:45:00.135365 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.135339 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="af74f9ab-b027-4c3d-aa5e-9bdccad76954" containerName="cleanup" Apr 20 21:45:00.135365 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.135348 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="af74f9ab-b027-4c3d-aa5e-9bdccad76954" containerName="cleanup" Apr 20 21:45:00.138379 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.138363 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" Apr 20 21:45:00.140287 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.140267 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-nd7zd\"" Apr 20 21:45:00.152853 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.152833 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612025-6vglv"] Apr 20 21:45:00.295593 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.295558 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-496tv\" (UniqueName: \"kubernetes.io/projected/55c754bf-8b50-4296-ad9c-584d7049fe70-kube-api-access-496tv\") pod \"maas-api-key-cleanup-29612025-6vglv\" (UID: \"55c754bf-8b50-4296-ad9c-584d7049fe70\") " pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" Apr 20 21:45:00.396212 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.396115 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-496tv\" (UniqueName: \"kubernetes.io/projected/55c754bf-8b50-4296-ad9c-584d7049fe70-kube-api-access-496tv\") pod \"maas-api-key-cleanup-29612025-6vglv\" (UID: \"55c754bf-8b50-4296-ad9c-584d7049fe70\") " pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" Apr 20 21:45:00.404095 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.404064 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-496tv\" (UniqueName: \"kubernetes.io/projected/55c754bf-8b50-4296-ad9c-584d7049fe70-kube-api-access-496tv\") pod \"maas-api-key-cleanup-29612025-6vglv\" (UID: \"55c754bf-8b50-4296-ad9c-584d7049fe70\") " pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" Apr 20 21:45:00.448611 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.448587 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" Apr 20 21:45:00.568361 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.568319 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612025-6vglv"] Apr 20 21:45:00.570587 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:45:00.570554 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55c754bf_8b50_4296_ad9c_584d7049fe70.slice/crio-2f4c4ab8883ff928c4cca9076f42ace623cd823cfa063a836b1b236cb5fdb817 WatchSource:0}: Error finding container 2f4c4ab8883ff928c4cca9076f42ace623cd823cfa063a836b1b236cb5fdb817: Status 404 returned error can't find the container with id 2f4c4ab8883ff928c4cca9076f42ace623cd823cfa063a836b1b236cb5fdb817 Apr 20 21:45:00.572505 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:00.572490 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 21:45:01.496241 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:01.496210 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" event={"ID":"55c754bf-8b50-4296-ad9c-584d7049fe70","Type":"ContainerStarted","Data":"54d8bd01e86c62fcac024eee6b49f117018ea97faaf77589126a36925167e62d"} Apr 20 21:45:01.496656 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:01.496248 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" event={"ID":"55c754bf-8b50-4296-ad9c-584d7049fe70","Type":"ContainerStarted","Data":"2f4c4ab8883ff928c4cca9076f42ace623cd823cfa063a836b1b236cb5fdb817"} Apr 20 21:45:01.516273 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:01.511466 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" podStartSLOduration=1.5114466009999998 podStartE2EDuration="1.511446601s" 
podCreationTimestamp="2026-04-20 21:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:45:01.508344032 +0000 UTC m=+1911.751303151" watchObservedRunningTime="2026-04-20 21:45:01.511446601 +0000 UTC m=+1911.754405722" Apr 20 21:45:21.568153 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:21.568088 2571 generic.go:358] "Generic (PLEG): container finished" podID="55c754bf-8b50-4296-ad9c-584d7049fe70" containerID="54d8bd01e86c62fcac024eee6b49f117018ea97faaf77589126a36925167e62d" exitCode=6 Apr 20 21:45:21.568153 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:21.568123 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" event={"ID":"55c754bf-8b50-4296-ad9c-584d7049fe70","Type":"ContainerDied","Data":"54d8bd01e86c62fcac024eee6b49f117018ea97faaf77589126a36925167e62d"} Apr 20 21:45:21.568536 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:21.568376 2571 scope.go:117] "RemoveContainer" containerID="54d8bd01e86c62fcac024eee6b49f117018ea97faaf77589126a36925167e62d" Apr 20 21:45:22.573380 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:22.573346 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" event={"ID":"55c754bf-8b50-4296-ad9c-584d7049fe70","Type":"ContainerStarted","Data":"668a59d42273fada64dc7af86f051be8722ea29b7f821c4c4632ac74067efc17"} Apr 20 21:45:42.643785 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:42.643748 2571 generic.go:358] "Generic (PLEG): container finished" podID="55c754bf-8b50-4296-ad9c-584d7049fe70" containerID="668a59d42273fada64dc7af86f051be8722ea29b7f821c4c4632ac74067efc17" exitCode=6 Apr 20 21:45:42.644275 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:42.643825 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" 
event={"ID":"55c754bf-8b50-4296-ad9c-584d7049fe70","Type":"ContainerDied","Data":"668a59d42273fada64dc7af86f051be8722ea29b7f821c4c4632ac74067efc17"} Apr 20 21:45:42.644275 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:42.643879 2571 scope.go:117] "RemoveContainer" containerID="54d8bd01e86c62fcac024eee6b49f117018ea97faaf77589126a36925167e62d" Apr 20 21:45:42.644275 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:42.644121 2571 scope.go:117] "RemoveContainer" containerID="668a59d42273fada64dc7af86f051be8722ea29b7f821c4c4632ac74067efc17" Apr 20 21:45:42.644518 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:45:42.644319 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29612025-6vglv_opendatahub(55c754bf-8b50-4296-ad9c-584d7049fe70)\"" pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" podUID="55c754bf-8b50-4296-ad9c-584d7049fe70" Apr 20 21:45:54.337674 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:54.337644 2571 scope.go:117] "RemoveContainer" containerID="668a59d42273fada64dc7af86f051be8722ea29b7f821c4c4632ac74067efc17" Apr 20 21:45:54.689771 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:54.689712 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" event={"ID":"55c754bf-8b50-4296-ad9c-584d7049fe70","Type":"ContainerStarted","Data":"3cfb22875556dff0326efbddba84d2fc9551c6ea37b1b4e8a8ca0d9f0acbbca5"} Apr 20 21:45:55.360523 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:55.360490 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612025-6vglv"] Apr 20 21:45:55.693145 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:45:55.693058 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" podUID="55c754bf-8b50-4296-ad9c-584d7049fe70" 
containerName="cleanup" containerID="cri-o://3cfb22875556dff0326efbddba84d2fc9551c6ea37b1b4e8a8ca0d9f0acbbca5" gracePeriod=30 Apr 20 21:46:15.131978 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:15.131955 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" Apr 20 21:46:15.184369 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:15.184336 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-496tv\" (UniqueName: \"kubernetes.io/projected/55c754bf-8b50-4296-ad9c-584d7049fe70-kube-api-access-496tv\") pod \"55c754bf-8b50-4296-ad9c-584d7049fe70\" (UID: \"55c754bf-8b50-4296-ad9c-584d7049fe70\") " Apr 20 21:46:15.186419 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:15.186382 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c754bf-8b50-4296-ad9c-584d7049fe70-kube-api-access-496tv" (OuterVolumeSpecName: "kube-api-access-496tv") pod "55c754bf-8b50-4296-ad9c-584d7049fe70" (UID: "55c754bf-8b50-4296-ad9c-584d7049fe70"). InnerVolumeSpecName "kube-api-access-496tv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:46:15.285672 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:15.285649 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-496tv\" (UniqueName: \"kubernetes.io/projected/55c754bf-8b50-4296-ad9c-584d7049fe70-kube-api-access-496tv\") on node \"ip-10-0-129-149.ec2.internal\" DevicePath \"\"" Apr 20 21:46:15.762634 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:15.762602 2571 generic.go:358] "Generic (PLEG): container finished" podID="55c754bf-8b50-4296-ad9c-584d7049fe70" containerID="3cfb22875556dff0326efbddba84d2fc9551c6ea37b1b4e8a8ca0d9f0acbbca5" exitCode=6 Apr 20 21:46:15.762932 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:15.762666 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" Apr 20 21:46:15.762932 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:15.762665 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" event={"ID":"55c754bf-8b50-4296-ad9c-584d7049fe70","Type":"ContainerDied","Data":"3cfb22875556dff0326efbddba84d2fc9551c6ea37b1b4e8a8ca0d9f0acbbca5"} Apr 20 21:46:15.762932 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:15.762714 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612025-6vglv" event={"ID":"55c754bf-8b50-4296-ad9c-584d7049fe70","Type":"ContainerDied","Data":"2f4c4ab8883ff928c4cca9076f42ace623cd823cfa063a836b1b236cb5fdb817"} Apr 20 21:46:15.762932 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:15.762736 2571 scope.go:117] "RemoveContainer" containerID="3cfb22875556dff0326efbddba84d2fc9551c6ea37b1b4e8a8ca0d9f0acbbca5" Apr 20 21:46:15.771373 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:15.771356 2571 scope.go:117] "RemoveContainer" containerID="668a59d42273fada64dc7af86f051be8722ea29b7f821c4c4632ac74067efc17" Apr 20 21:46:15.780674 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:15.780656 2571 scope.go:117] "RemoveContainer" containerID="3cfb22875556dff0326efbddba84d2fc9551c6ea37b1b4e8a8ca0d9f0acbbca5" Apr 20 21:46:15.780927 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:46:15.780907 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cfb22875556dff0326efbddba84d2fc9551c6ea37b1b4e8a8ca0d9f0acbbca5\": container with ID starting with 3cfb22875556dff0326efbddba84d2fc9551c6ea37b1b4e8a8ca0d9f0acbbca5 not found: ID does not exist" containerID="3cfb22875556dff0326efbddba84d2fc9551c6ea37b1b4e8a8ca0d9f0acbbca5" Apr 20 21:46:15.781008 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:15.780939 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3cfb22875556dff0326efbddba84d2fc9551c6ea37b1b4e8a8ca0d9f0acbbca5"} err="failed to get container status \"3cfb22875556dff0326efbddba84d2fc9551c6ea37b1b4e8a8ca0d9f0acbbca5\": rpc error: code = NotFound desc = could not find container \"3cfb22875556dff0326efbddba84d2fc9551c6ea37b1b4e8a8ca0d9f0acbbca5\": container with ID starting with 3cfb22875556dff0326efbddba84d2fc9551c6ea37b1b4e8a8ca0d9f0acbbca5 not found: ID does not exist" Apr 20 21:46:15.781008 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:15.780963 2571 scope.go:117] "RemoveContainer" containerID="668a59d42273fada64dc7af86f051be8722ea29b7f821c4c4632ac74067efc17" Apr 20 21:46:15.781429 ip-10-0-129-149 kubenswrapper[2571]: E0420 21:46:15.781353 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"668a59d42273fada64dc7af86f051be8722ea29b7f821c4c4632ac74067efc17\": container with ID starting with 668a59d42273fada64dc7af86f051be8722ea29b7f821c4c4632ac74067efc17 not found: ID does not exist" containerID="668a59d42273fada64dc7af86f051be8722ea29b7f821c4c4632ac74067efc17" Apr 20 21:46:15.781429 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:15.781384 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"668a59d42273fada64dc7af86f051be8722ea29b7f821c4c4632ac74067efc17"} err="failed to get container status \"668a59d42273fada64dc7af86f051be8722ea29b7f821c4c4632ac74067efc17\": rpc error: code = NotFound desc = could not find container \"668a59d42273fada64dc7af86f051be8722ea29b7f821c4c4632ac74067efc17\": container with ID starting with 668a59d42273fada64dc7af86f051be8722ea29b7f821c4c4632ac74067efc17 not found: ID does not exist" Apr 20 21:46:15.782459 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:15.782435 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612025-6vglv"] Apr 20 21:46:15.784285 ip-10-0-129-149 
kubenswrapper[2571]: I0420 21:46:15.784265 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612025-6vglv"] Apr 20 21:46:16.342714 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:16.342689 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c754bf-8b50-4296-ad9c-584d7049fe70" path="/var/lib/kubelet/pods/55c754bf-8b50-4296-ad9c-584d7049fe70/volumes" Apr 20 21:46:47.598445 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:47.598386 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7d5fd76f94-q8rtt_f76b5c7f-0653-4eab-b706-ebff9786f2e5/authorino/0.log" Apr 20 21:46:51.617871 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:51.617817 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-b7b7fc65d-vdpfj_186e0313-7256-4591-9648-082da62b08cf/manager/0.log" Apr 20 21:46:51.841288 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:51.841236 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-85fc55dd88-5pr8n_9b6b2586-750e-4c97-bd26-6712ca103c6d/manager/0.log" Apr 20 21:46:53.002320 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:53.002282 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89_d712d209-faa7-4304-af59-07b4282072e9/util/0.log" Apr 20 21:46:53.008257 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:53.008238 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89_d712d209-faa7-4304-af59-07b4282072e9/pull/0.log" Apr 20 21:46:53.013999 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:53.013985 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89_d712d209-faa7-4304-af59-07b4282072e9/extract/0.log" Apr 20 21:46:53.131101 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:53.131061 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4_ee1835fe-3c55-4dad-a004-b108263d101e/extract/0.log" Apr 20 21:46:53.137410 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:53.137380 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4_ee1835fe-3c55-4dad-a004-b108263d101e/util/0.log" Apr 20 21:46:53.144030 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:53.144014 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4_ee1835fe-3c55-4dad-a004-b108263d101e/pull/0.log" Apr 20 21:46:53.259652 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:53.259580 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h_71fe2008-8bc9-49f7-90c6-8caefcc51c20/extract/0.log" Apr 20 21:46:53.265518 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:53.265493 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h_71fe2008-8bc9-49f7-90c6-8caefcc51c20/util/0.log" Apr 20 21:46:53.271468 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:53.271455 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h_71fe2008-8bc9-49f7-90c6-8caefcc51c20/pull/0.log" Apr 20 21:46:53.509429 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:53.509403 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_authorino-7d5fd76f94-q8rtt_f76b5c7f-0653-4eab-b706-ebff9786f2e5/authorino/0.log" Apr 20 21:46:53.742869 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:53.742847 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-5mjpq_a2b5b080-0f37-4b48-a783-aeaff522e227/manager/0.log" Apr 20 21:46:53.848821 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:53.848796 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-476zv_aef360e0-704a-4637-997b-87e74a9488bc/kuadrant-console-plugin/0.log" Apr 20 21:46:54.319385 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:54.319361 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-sbtxk_a84abe7e-a70e-49c4-8943-79c66da4e3a0/manager/0.log" Apr 20 21:46:54.676413 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:54.676351 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw_cfc0536e-7bf6-4f10-ab67-6859ef69f0be/istio-proxy/0.log" Apr 20 21:46:55.138071 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:55.138036 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-sqlf6_44bb23ed-dd9f-450b-b0ea-db4715d86492/istio-proxy/0.log" Apr 20 21:46:55.259004 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:55.258975 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-79b56464d-ztxdm_299bba46-6418-4baf-8a89-6db7597a7bc4/router/0.log" Apr 20 21:46:55.936802 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:55.936774 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7_36c013e0-a4ff-4ced-9363-069ce4cc1c44/storage-initializer/0.log" Apr 20 21:46:55.943599 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:55.943581 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc72lj7_36c013e0-a4ff-4ced-9363-069ce4cc1c44/main/0.log" Apr 20 21:46:59.788671 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.788634 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rs84k/must-gather-zzzc7"] Apr 20 21:46:59.789049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.788977 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55c754bf-8b50-4296-ad9c-584d7049fe70" containerName="cleanup" Apr 20 21:46:59.789049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.788988 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c754bf-8b50-4296-ad9c-584d7049fe70" containerName="cleanup" Apr 20 21:46:59.789049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.789013 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55c754bf-8b50-4296-ad9c-584d7049fe70" containerName="cleanup" Apr 20 21:46:59.789049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.789018 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c754bf-8b50-4296-ad9c-584d7049fe70" containerName="cleanup" Apr 20 21:46:59.789049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.789027 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af74f9ab-b027-4c3d-aa5e-9bdccad76954" containerName="cleanup" Apr 20 21:46:59.789049 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.789033 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="af74f9ab-b027-4c3d-aa5e-9bdccad76954" containerName="cleanup" Apr 20 21:46:59.789261 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.789086 2571 
memory_manager.go:356] "RemoveStaleState removing state" podUID="55c754bf-8b50-4296-ad9c-584d7049fe70" containerName="cleanup" Apr 20 21:46:59.789261 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.789093 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="55c754bf-8b50-4296-ad9c-584d7049fe70" containerName="cleanup" Apr 20 21:46:59.789261 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.789154 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55c754bf-8b50-4296-ad9c-584d7049fe70" containerName="cleanup" Apr 20 21:46:59.789261 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.789159 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c754bf-8b50-4296-ad9c-584d7049fe70" containerName="cleanup" Apr 20 21:46:59.789261 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.789228 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="55c754bf-8b50-4296-ad9c-584d7049fe70" containerName="cleanup" Apr 20 21:46:59.792327 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.792306 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rs84k/must-gather-zzzc7" Apr 20 21:46:59.794398 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.794378 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rs84k\"/\"default-dockercfg-b9scs\"" Apr 20 21:46:59.794492 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.794423 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rs84k\"/\"openshift-service-ca.crt\"" Apr 20 21:46:59.794492 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.794426 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rs84k\"/\"kube-root-ca.crt\"" Apr 20 21:46:59.802392 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.802368 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rs84k/must-gather-zzzc7"] Apr 20 21:46:59.940528 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.940498 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz9mj\" (UniqueName: \"kubernetes.io/projected/9b3dac98-cbd0-43a5-a74d-bfe4fd2d30a0-kube-api-access-wz9mj\") pod \"must-gather-zzzc7\" (UID: \"9b3dac98-cbd0-43a5-a74d-bfe4fd2d30a0\") " pod="openshift-must-gather-rs84k/must-gather-zzzc7" Apr 20 21:46:59.940739 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:46:59.940600 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9b3dac98-cbd0-43a5-a74d-bfe4fd2d30a0-must-gather-output\") pod \"must-gather-zzzc7\" (UID: \"9b3dac98-cbd0-43a5-a74d-bfe4fd2d30a0\") " pod="openshift-must-gather-rs84k/must-gather-zzzc7" Apr 20 21:47:00.041425 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:00.041348 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/9b3dac98-cbd0-43a5-a74d-bfe4fd2d30a0-must-gather-output\") pod \"must-gather-zzzc7\" (UID: \"9b3dac98-cbd0-43a5-a74d-bfe4fd2d30a0\") " pod="openshift-must-gather-rs84k/must-gather-zzzc7" Apr 20 21:47:00.041562 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:00.041426 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wz9mj\" (UniqueName: \"kubernetes.io/projected/9b3dac98-cbd0-43a5-a74d-bfe4fd2d30a0-kube-api-access-wz9mj\") pod \"must-gather-zzzc7\" (UID: \"9b3dac98-cbd0-43a5-a74d-bfe4fd2d30a0\") " pod="openshift-must-gather-rs84k/must-gather-zzzc7" Apr 20 21:47:00.041687 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:00.041669 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9b3dac98-cbd0-43a5-a74d-bfe4fd2d30a0-must-gather-output\") pod \"must-gather-zzzc7\" (UID: \"9b3dac98-cbd0-43a5-a74d-bfe4fd2d30a0\") " pod="openshift-must-gather-rs84k/must-gather-zzzc7" Apr 20 21:47:00.048888 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:00.048863 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz9mj\" (UniqueName: \"kubernetes.io/projected/9b3dac98-cbd0-43a5-a74d-bfe4fd2d30a0-kube-api-access-wz9mj\") pod \"must-gather-zzzc7\" (UID: \"9b3dac98-cbd0-43a5-a74d-bfe4fd2d30a0\") " pod="openshift-must-gather-rs84k/must-gather-zzzc7" Apr 20 21:47:00.102003 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:00.101974 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rs84k/must-gather-zzzc7" Apr 20 21:47:00.429977 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:00.429958 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rs84k/must-gather-zzzc7"] Apr 20 21:47:00.435836 ip-10-0-129-149 kubenswrapper[2571]: W0420 21:47:00.435804 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b3dac98_cbd0_43a5_a74d_bfe4fd2d30a0.slice/crio-a3f754451e1c24cd6c3ae006fccf7eb6970d54ae1aab93191735d5779e3c9ce1 WatchSource:0}: Error finding container a3f754451e1c24cd6c3ae006fccf7eb6970d54ae1aab93191735d5779e3c9ce1: Status 404 returned error can't find the container with id a3f754451e1c24cd6c3ae006fccf7eb6970d54ae1aab93191735d5779e3c9ce1 Apr 20 21:47:00.931099 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:00.931068 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs84k/must-gather-zzzc7" event={"ID":"9b3dac98-cbd0-43a5-a74d-bfe4fd2d30a0","Type":"ContainerStarted","Data":"a3f754451e1c24cd6c3ae006fccf7eb6970d54ae1aab93191735d5779e3c9ce1"} Apr 20 21:47:01.936676 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:01.936569 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs84k/must-gather-zzzc7" event={"ID":"9b3dac98-cbd0-43a5-a74d-bfe4fd2d30a0","Type":"ContainerStarted","Data":"cc17d6187a6ca694d6951587a6622ace60c2dcb03d7885e41ee32173cbba46ee"} Apr 20 21:47:01.936676 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:01.936622 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs84k/must-gather-zzzc7" event={"ID":"9b3dac98-cbd0-43a5-a74d-bfe4fd2d30a0","Type":"ContainerStarted","Data":"df33fcdca911eb43073e0d268dc40f0be83813544db6456f9f809cf5ef7959bc"} Apr 20 21:47:01.956986 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:01.956925 2571 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-rs84k/must-gather-zzzc7" podStartSLOduration=2.145870671 podStartE2EDuration="2.956905493s" podCreationTimestamp="2026-04-20 21:46:59 +0000 UTC" firstStartedPulling="2026-04-20 21:47:00.437584287 +0000 UTC m=+2030.680543384" lastFinishedPulling="2026-04-20 21:47:01.248619105 +0000 UTC m=+2031.491578206" observedRunningTime="2026-04-20 21:47:01.954826588 +0000 UTC m=+2032.197785709" watchObservedRunningTime="2026-04-20 21:47:01.956905493 +0000 UTC m=+2032.199864612" Apr 20 21:47:02.944512 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:02.944480 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-h84cc_f42114d2-f0ca-4f06-bdf6-49ec62ba06a3/global-pull-secret-syncer/0.log" Apr 20 21:47:03.092457 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:03.092419 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nztmh_86a41be0-f642-425a-a950-24cf589ab648/konnectivity-agent/0.log" Apr 20 21:47:03.150436 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:03.150400 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-149.ec2.internal_bc561987305693c23e8fd4b20c60c28f/haproxy/0.log" Apr 20 21:47:07.058825 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:07.058791 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89_d712d209-faa7-4304-af59-07b4282072e9/extract/0.log" Apr 20 21:47:07.089200 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:07.089156 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89_d712d209-faa7-4304-af59-07b4282072e9/util/0.log" Apr 20 21:47:07.115708 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:07.115677 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s4t89_d712d209-faa7-4304-af59-07b4282072e9/pull/0.log" Apr 20 21:47:07.148902 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:07.148866 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4_ee1835fe-3c55-4dad-a004-b108263d101e/extract/0.log" Apr 20 21:47:07.170623 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:07.170596 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4_ee1835fe-3c55-4dad-a004-b108263d101e/util/0.log" Apr 20 21:47:07.192629 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:07.192601 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0hwqj4_ee1835fe-3c55-4dad-a004-b108263d101e/pull/0.log" Apr 20 21:47:07.217485 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:07.217444 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h_71fe2008-8bc9-49f7-90c6-8caefcc51c20/extract/0.log" Apr 20 21:47:07.242474 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:07.242443 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h_71fe2008-8bc9-49f7-90c6-8caefcc51c20/util/0.log" Apr 20 21:47:07.263554 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:07.263520 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736s58h_71fe2008-8bc9-49f7-90c6-8caefcc51c20/pull/0.log" Apr 20 21:47:07.609237 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:07.609206 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_authorino-7d5fd76f94-q8rtt_f76b5c7f-0653-4eab-b706-ebff9786f2e5/authorino/0.log" Apr 20 21:47:07.663857 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:07.663821 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-5mjpq_a2b5b080-0f37-4b48-a783-aeaff522e227/manager/0.log" Apr 20 21:47:07.686403 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:07.686371 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-476zv_aef360e0-704a-4637-997b-87e74a9488bc/kuadrant-console-plugin/0.log" Apr 20 21:47:07.875820 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:07.875749 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-sbtxk_a84abe7e-a70e-49c4-8943-79c66da4e3a0/manager/0.log" Apr 20 21:47:09.552145 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:09.552115 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-jltn6_01779d30-26d6-410c-b5b1-a6f02ae25857/monitoring-plugin/0.log" Apr 20 21:47:09.646114 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:09.646073 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nmpg7_67b17fac-14b3-453d-b0aa-0062a9cf986e/node-exporter/0.log" Apr 20 21:47:09.665504 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:09.665456 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nmpg7_67b17fac-14b3-453d-b0aa-0062a9cf986e/kube-rbac-proxy/0.log" Apr 20 21:47:09.685529 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:09.685495 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nmpg7_67b17fac-14b3-453d-b0aa-0062a9cf986e/init-textfile/0.log" Apr 20 21:47:09.775977 ip-10-0-129-149 kubenswrapper[2571]: I0420 
21:47:09.775948 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-826s9_0226bfa3-7ec0-490e-944a-1e60da426ea1/kube-rbac-proxy-main/0.log" Apr 20 21:47:09.795666 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:09.795628 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-826s9_0226bfa3-7ec0-490e-944a-1e60da426ea1/kube-rbac-proxy-self/0.log" Apr 20 21:47:09.816755 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:09.816686 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-826s9_0226bfa3-7ec0-490e-944a-1e60da426ea1/openshift-state-metrics/0.log" Apr 20 21:47:11.284242 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.284206 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk"] Apr 20 21:47:11.291343 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.291310 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.293350 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.293318 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk"] Apr 20 21:47:11.452133 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.452090 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmz7q\" (UniqueName: \"kubernetes.io/projected/5a3950ac-2fed-4b3e-804b-b21feb755137-kube-api-access-hmz7q\") pod \"perf-node-gather-daemonset-sw9gk\" (UID: \"5a3950ac-2fed-4b3e-804b-b21feb755137\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.452308 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.452234 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a3950ac-2fed-4b3e-804b-b21feb755137-lib-modules\") pod \"perf-node-gather-daemonset-sw9gk\" (UID: \"5a3950ac-2fed-4b3e-804b-b21feb755137\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.452308 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.452265 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5a3950ac-2fed-4b3e-804b-b21feb755137-proc\") pod \"perf-node-gather-daemonset-sw9gk\" (UID: \"5a3950ac-2fed-4b3e-804b-b21feb755137\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.452308 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.452288 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5a3950ac-2fed-4b3e-804b-b21feb755137-podres\") pod \"perf-node-gather-daemonset-sw9gk\" (UID: 
\"5a3950ac-2fed-4b3e-804b-b21feb755137\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.452467 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.452417 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a3950ac-2fed-4b3e-804b-b21feb755137-sys\") pod \"perf-node-gather-daemonset-sw9gk\" (UID: \"5a3950ac-2fed-4b3e-804b-b21feb755137\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.553443 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.553356 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a3950ac-2fed-4b3e-804b-b21feb755137-sys\") pod \"perf-node-gather-daemonset-sw9gk\" (UID: \"5a3950ac-2fed-4b3e-804b-b21feb755137\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.553612 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.553442 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmz7q\" (UniqueName: \"kubernetes.io/projected/5a3950ac-2fed-4b3e-804b-b21feb755137-kube-api-access-hmz7q\") pod \"perf-node-gather-daemonset-sw9gk\" (UID: \"5a3950ac-2fed-4b3e-804b-b21feb755137\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.553612 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.553462 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a3950ac-2fed-4b3e-804b-b21feb755137-sys\") pod \"perf-node-gather-daemonset-sw9gk\" (UID: \"5a3950ac-2fed-4b3e-804b-b21feb755137\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.553612 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.553573 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/5a3950ac-2fed-4b3e-804b-b21feb755137-lib-modules\") pod \"perf-node-gather-daemonset-sw9gk\" (UID: \"5a3950ac-2fed-4b3e-804b-b21feb755137\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.553784 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.553629 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5a3950ac-2fed-4b3e-804b-b21feb755137-proc\") pod \"perf-node-gather-daemonset-sw9gk\" (UID: \"5a3950ac-2fed-4b3e-804b-b21feb755137\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.553784 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.553683 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5a3950ac-2fed-4b3e-804b-b21feb755137-podres\") pod \"perf-node-gather-daemonset-sw9gk\" (UID: \"5a3950ac-2fed-4b3e-804b-b21feb755137\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.553784 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.553733 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a3950ac-2fed-4b3e-804b-b21feb755137-lib-modules\") pod \"perf-node-gather-daemonset-sw9gk\" (UID: \"5a3950ac-2fed-4b3e-804b-b21feb755137\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.553784 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.553741 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5a3950ac-2fed-4b3e-804b-b21feb755137-proc\") pod \"perf-node-gather-daemonset-sw9gk\" (UID: \"5a3950ac-2fed-4b3e-804b-b21feb755137\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.553978 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.553888 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5a3950ac-2fed-4b3e-804b-b21feb755137-podres\") pod \"perf-node-gather-daemonset-sw9gk\" (UID: \"5a3950ac-2fed-4b3e-804b-b21feb755137\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.560758 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.560735 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmz7q\" (UniqueName: \"kubernetes.io/projected/5a3950ac-2fed-4b3e-804b-b21feb755137-kube-api-access-hmz7q\") pod \"perf-node-gather-daemonset-sw9gk\" (UID: \"5a3950ac-2fed-4b3e-804b-b21feb755137\") " pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.609078 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.609045 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:11.775890 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.775805 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/1.log" Apr 20 21:47:11.776913 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.776805 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk"] Apr 20 21:47:11.781543 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.781448 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hjvgw_838a3bd4-1a50-4127-a629-525bfede6ffd/console-operator/2.log" Apr 20 21:47:11.988230 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.988164 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" 
event={"ID":"5a3950ac-2fed-4b3e-804b-b21feb755137","Type":"ContainerStarted","Data":"1b4ff54bab958e0b0ef72addee4f459eeecba9daa8a20d69a03c777a7623b590"} Apr 20 21:47:11.988429 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.988240 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" event={"ID":"5a3950ac-2fed-4b3e-804b-b21feb755137","Type":"ContainerStarted","Data":"2214aeb5b95ec92d37857d5ca02db0ff4b6fb2a7654420d9675df628d1834f5b"} Apr 20 21:47:11.989085 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:11.989057 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:12.004605 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:12.004536 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" podStartSLOduration=1.004520892 podStartE2EDuration="1.004520892s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:47:12.002677273 +0000 UTC m=+2042.245636392" watchObservedRunningTime="2026-04-20 21:47:12.004520892 +0000 UTC m=+2042.247480296" Apr 20 21:47:12.243866 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:12.243836 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-s2hsb_ce5dfb59-8be5-484c-868f-587ecd9948e3/download-server/0.log" Apr 20 21:47:12.714831 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:12.714750 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-s4nps_6500e95b-38f5-4c5c-b0f3-f38cf82ffcb6/volume-data-source-validator/0.log" Apr 20 21:47:13.444880 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:13.444846 2571 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cqwmd_096e06e8-bf34-462d-9f43-fd87848fd09e/dns/0.log" Apr 20 21:47:13.463272 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:13.463244 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cqwmd_096e06e8-bf34-462d-9f43-fd87848fd09e/kube-rbac-proxy/0.log" Apr 20 21:47:13.525469 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:13.525445 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9xgfk_47953ca1-cc2f-4035-8d59-26be8c7a9516/dns-node-resolver/0.log" Apr 20 21:47:14.004619 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:14.004594 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-56b9995c8d-m5vt9_adb3353d-8a6b-4b5d-9dbe-795907ebf77a/registry/0.log" Apr 20 21:47:14.023069 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:14.023043 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fwbvz_95826e15-25fd-44ed-bc3e-c54baaa50bb7/node-ca/0.log" Apr 20 21:47:14.826297 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:14.826270 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfz2qsw_cfc0536e-7bf6-4f10-ab67-6859ef69f0be/istio-proxy/0.log" Apr 20 21:47:15.016714 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:15.016678 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-sqlf6_44bb23ed-dd9f-450b-b0ea-db4715d86492/istio-proxy/0.log" Apr 20 21:47:15.035424 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:15.035398 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-79b56464d-ztxdm_299bba46-6418-4baf-8a89-6db7597a7bc4/router/0.log" Apr 20 21:47:15.502464 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:15.502439 2571 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ld4fl_08619f25-e76b-45d3-ab4b-8e9490d505f9/serve-healthcheck-canary/0.log" Apr 20 21:47:15.930531 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:15.930419 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-xxkhk_f6a03f80-2426-4087-b868-a71402310e22/insights-operator/1.log" Apr 20 21:47:15.930814 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:15.930677 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-xxkhk_f6a03f80-2426-4087-b868-a71402310e22/insights-operator/0.log" Apr 20 21:47:16.158906 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:16.158866 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-z4nxr_dd7b97ad-bb33-454d-96b8-cbbc807198ef/kube-rbac-proxy/0.log" Apr 20 21:47:16.178114 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:16.178091 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-z4nxr_dd7b97ad-bb33-454d-96b8-cbbc807198ef/exporter/0.log" Apr 20 21:47:16.197529 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:16.197481 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-z4nxr_dd7b97ad-bb33-454d-96b8-cbbc807198ef/extractor/0.log" Apr 20 21:47:18.029667 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:18.029636 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-b7b7fc65d-vdpfj_186e0313-7256-4591-9648-082da62b08cf/manager/0.log" Apr 20 21:47:18.082989 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:18.082955 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-85fc55dd88-5pr8n_9b6b2586-750e-4c97-bd26-6712ca103c6d/manager/0.log" Apr 20 21:47:19.009682 
ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:19.009650 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-rs84k/perf-node-gather-daemonset-sw9gk" Apr 20 21:47:19.250676 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:19.250637 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-56b87855f9-7pkwp_9bb6896d-7f01-46d8-9997-83170aa22077/manager/0.log" Apr 20 21:47:23.591748 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:23.591701 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-scg9x_d157b838-4286-4cdb-9399-eea3bc5bb5fd/migrator/0.log" Apr 20 21:47:23.617405 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:23.617384 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-scg9x_d157b838-4286-4cdb-9399-eea3bc5bb5fd/graceful-termination/0.log" Apr 20 21:47:23.950514 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:23.950402 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-8gwlz_36c761e2-ffbf-4d69-8d19-9b3793a3acf9/kube-storage-version-migrator-operator/1.log" Apr 20 21:47:23.951221 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:23.951172 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-8gwlz_36c761e2-ffbf-4d69-8d19-9b3793a3acf9/kube-storage-version-migrator-operator/0.log" Apr 20 21:47:24.895228 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:24.895198 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8mchz_cf801c74-93a7-4e27-ba8a-0c31596e95c6/kube-multus/0.log" Apr 20 21:47:25.059683 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:25.059658 2571 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rhd4c_6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9/kube-multus-additional-cni-plugins/0.log" Apr 20 21:47:25.079248 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:25.079225 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rhd4c_6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9/egress-router-binary-copy/0.log" Apr 20 21:47:25.102698 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:25.102676 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rhd4c_6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9/cni-plugins/0.log" Apr 20 21:47:25.121416 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:25.121396 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rhd4c_6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9/bond-cni-plugin/0.log" Apr 20 21:47:25.140587 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:25.140567 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rhd4c_6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9/routeoverride-cni/0.log" Apr 20 21:47:25.161197 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:25.161116 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rhd4c_6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9/whereabouts-cni-bincopy/0.log" Apr 20 21:47:25.180095 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:25.180073 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rhd4c_6a5af7de-f9f6-43b2-aa6a-2a145cafbfb9/whereabouts-cni/0.log" Apr 20 21:47:25.434234 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:25.434143 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-fk9cw_89e3c54c-a866-4c9b-940d-54a417b5c964/network-metrics-daemon/0.log" Apr 20 21:47:25.451968 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:25.451943 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fk9cw_89e3c54c-a866-4c9b-940d-54a417b5c964/kube-rbac-proxy/0.log" Apr 20 21:47:26.742208 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:26.742165 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-controller/0.log" Apr 20 21:47:26.757873 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:26.757848 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/0.log" Apr 20 21:47:26.769901 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:26.769874 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovn-acl-logging/1.log" Apr 20 21:47:26.787284 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:26.787260 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/kube-rbac-proxy-node/0.log" Apr 20 21:47:26.805639 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:26.805614 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 21:47:26.823086 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:26.823062 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/northd/0.log" Apr 20 21:47:26.841841 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:26.841824 2571 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/nbdb/0.log" Apr 20 21:47:26.862308 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:26.862289 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/sbdb/0.log" Apr 20 21:47:26.970377 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:26.970308 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6skp_82c75868-1659-4814-b726-ba733f5f2ebc/ovnkube-controller/0.log" Apr 20 21:47:27.987404 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:27.987348 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-9nwmh_f41d2d0b-36e7-42ab-a7e1-486ca3970554/check-endpoints/0.log" Apr 20 21:47:28.031992 ip-10-0-129-149 kubenswrapper[2571]: I0420 21:47:28.031968 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-cxz7h_369b8c8d-720a-4d32-a69a-64bd50a8103a/network-check-target-container/0.log"